How to connect a bash or shell to a lxd container - lxd

I have many LXD containers on my system. I cannot access them over the network and I do not have passwords for them.
Is there a way to attach a shell, like we can do with docker, for example?

Enable networking on the remote LXD:
lxc config set core.https_address [::]:8443
lxc config set core.trust_password PASSWORD
Add the target host as a remote (it will ask for PASSWORD):
lxc remote add REMOTE_NAME IP_ADDRESS
Run commands on the remote system:
lxc exec REMOTE_NAME:CONTAINER COMMAND
To get a shell you can pass bash or sh as the command in most cases.
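Since the containers in the question are on the same host, the remote setup above is optional; a minimal local sketch (the container name web01 is a placeholder):

```shell
# Open a shell directly in a local container (no password or network needed)
lxc exec web01 -- bash

# If bash is not installed in the container, fall back to sh
lxc exec web01 -- sh
```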
More here:
https://web.archive.org/web/20160818010904/https://www.stgraber.org/2016/04/12/lxd-2-0-remote-hosts-and-container-migration-612/

Related

How to do ssh to an ec2 server using Jenkins pipeline with pem file

I am trying to do ssh to an ec2 instance through Jenkins pipeline,using a pem file present on my local system, but I am unable to connect to ec2 instance.
ssh command
ssh -i test.pem -o StrictHostKeyChecking=no -p 22 ubuntu@ip
I am able to connect to the AWS instance from my local machine. I am running the Jenkins pipeline on the master node only. Is there any issue with the user of the pem file, as the username for the pem file is ubuntu, not jenkins?
I was able to resolve this issue by installing the SSH Agent plugin in Jenkins. After installing the SSH Agent plugin, we need to configure server details under Manage Jenkins -> Manage Credentials. Here we need to give the host name, user, and private key, which would be your pem file.
When adding the ssh agent into your Jenkins pipeline script, you can follow the approach below.
sshagent(credentials: ['id_name_added_underManageCredential']) {
    sh "ssh command"
}
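Inside the sshagent block, the actual command could look like the following; the host address and script path are placeholders, and no -i flag is needed because the plugin supplies the key via the agent:

```shell
# Run a remote command on the EC2 instance using the agent-provided key
ssh -o StrictHostKeyChecking=no ubuntu@ec2-12-34-56-78.compute-1.amazonaws.com 'bash /opt/deploy/deploy.sh'
```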

Connecting to container using ansible

I have built a docker image and started the container using ansible. I'm running into an issue trying to create a dynamic connection to the container from the docker host to set some environment variables and execute some scripts. I know ansible does not use ssh to connect to the container, though I can use the expect module to run this command: "ssh root@localhost -p 12345". How do I add and maintain a connection to the container using the ansible docker connection plugin, or by pointing directly at the docker host? This is all running in an AWS EC2 instance.
I think I need to run ansible as an equivalent to this command used to connect to the container host: "docker exec -i -t container_host_server /bin/bash".
- name: Create a data container
  docker_container:
    name: packageserver
    image: my_image/image_name
    tty: yes
    state: started
    detach: yes
    volumes:
      - /var/www/html
    published_ports:
      - "12345:22"
      - "80:80"
  register: container
Thanks in Advance,
DT
To set environment variables you can use the "env" parameter in your docker_container task.
In the docker_container task you can add the "command" parameter to override the command defined as CMD in the Dockerfile of your docker image, something like
command: PathToYourScript && sleep infinity
In your example you expose container port 22, so it seems you want to run sshd inside the container. Although it's not a best practice in Docker, if you want sshd running you have to start it using the command parameter in the docker_container task:
command: ['/usr/sbin/sshd', '-D']
Doing so (and having defined a user in the container), you'll be able to connect to your container with
ssh -p 12345 user@dockerServer
or, as in your example, "ssh -p 12345 root@localhost" if your image already defines a root user and you are working on localhost.
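As an alternative to running sshd, the docker connection plugin mentioned in the question can reach the container without SSH at all. A hedged sketch, run from the docker host; the container name matches the task above, but the variable and script path are assumptions:

```shell
# Ad-hoc: run a script inside the container via the docker connection plugin.
# The trailing comma makes "packageserver" a one-host inline inventory.
ansible all -i 'packageserver,' -c docker -m shell -a 'MY_VAR=value /opt/scripts/setup.sh'
```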

Running a Bash Script from (on Docker Container B) from Docker Container A

I have two Docker Containers configured through a Docker Compose file.
Docker Container A - (teamcity-agent)
Docker Container B - (build-tool)
Both start up fine. But as part of the build process in TeamCity - I would like the Agent (Container A) to run a bash script which is on Docker Container B (Only B can run this script).
I tried to set this up using the SSH build step in Team City, but I get connection refused.
Further reading into it shows that SSH isn't enabled in containers and that I shouldn't really be trying to SSH into a container.
So how can I get Container A to run the script on Container B and see the output of the script on A?
What is the best practice for this?
The only way without modifying the application itself is through SSH. It is simply not true that you cannot SSH to a container; I use SSH to a database container to run database exports inside it.
First be sure openssh-server is installed on B. Then you must set up a passwordless (key-based) SSH connection between A and B.
Then be sure you link your containers in the docker-compose file so you won't need to expose the SSH port.
Snippet to add in Dockerfile for container B
RUN apt-get update && apt-get install -q -y openssh-server
ADD id_rsa.pub /home/ubuntu/.ssh/authorized_keys
RUN chown -R ubuntu:ubuntu /home/ubuntu/.ssh ; \
chmod 700 /home/ubuntu/.ssh ; \
chmod 600 /home/ubuntu/.ssh/authorized_keys
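With the key baked in and the containers linked, container A can then run the script on B with something like the following; the hostname is the compose service name from the question, and the script path is an assumption:

```shell
# Run the script on build-tool from teamcity-agent; stdout streams back to A,
# so TeamCity sees the output in the build log
ssh -o StrictHostKeyChecking=no ubuntu@build-tool 'bash /opt/build/build.sh'
```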
Also, you can run the script outside the containers using docker exec in a crontab on the host, but I think you are not looking for that extreme solution.

Access Docker daemon Remote api on Docker for Mac

I'm running Docker for OSX, and having trouble getting the Docker remote API to work.
My situation is this:
Docker daemon running natively on OSX (https://www.docker.com/products/docker#/mac, so not the boot2docker variant)
Jenkins running as docker image
Now I want to use the Jenkins docker-build-step plugin to build a docker image, but I want it to use the docker daemon on the host machine, so in Jenkins settings, DOCKER_URL should be something like :2375. (The reason for this is that I don't want to install docker in the jenkins container if I already have it on my host machine.)
Is there a way to do this, or does Docker for Mac currently not support it? I tried fiddling with the export DOCKER_OPTS or DOCKER_HOST options but still get a Connection refused when calling http://localhost:2375/images/json, for example.
Basically the question is about enabling the Docker for OSX remote API, with the use case of calling it from a Jenkins docker container.
You could consider using socat. It solved my problem, which seems to be similar.
socat TCP-LISTEN:2375,reuseaddr,fork UNIX-CONNECT:/var/run/docker.sock &
This allows you to access your macOS host Docker API from a Docker container using: tcp://[host IP address]:2375
On macOS socat can be installed like this:
brew install socat
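From inside the Jenkins container you can then check the forwarded API. The host address below is an assumption: on recent Docker for Mac releases host.docker.internal resolves to the host, while older setups may need the host's LAN IP:

```shell
# Should return a JSON array of images if the socat forward is working
curl http://host.docker.internal:2375/images/json
```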
See here for a long discussion on this topic: Plugin: Docker fails to connect via unix:// on Mac OS X
If you already added an SSH public key to your remote server, then you can use these ssh credentials for your docker connection, too. You don't need to configure the remote API on the server for this approach.
When connecting to macOS Docker Desktop, you can use ssh (after you have enabled it on the Mac):
docker -H ssh://user@192.168.64.1 images
or
export DOCKER_HOST=ssh://user@192.168.64.1
docker images
I had the same issue, but with mysql. I needed to expose port 43306 on the docker host and forward it to port 3306 in the mysql container.
Solution:
Start your container with the -p parameter.
Example:
docker run -p 0.0.0.0:43306:3306 --name mysql-5.7.23xx -e MYSQL_ROOT_PASSWORD=myrootdba -d mysql/mysql-server:5.7.23 --character-set-server=utf8mb4 --collation-server=utf8mb4_unicode_ci
Now I can connect from my docker host on port 43306 to the mysql container.
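You can verify the mapping from the host with a quick client connection (the password is the one set via MYSQL_ROOT_PASSWORD above):

```shell
# Connect through the published host port rather than the container IP
mysql -h 127.0.0.1 -P 43306 -u root -p
```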

How do I SSH to a Docker in Mac container [duplicate]

This question already has answers here:
docker: SSH access directly into container
(4 answers)
Closed 6 years ago.
I am running Docker for Mac (Version 1.12.0-rc2-beta16 (Build: 9493)).
I have pulled an image from my local repository, and used 'docker run -d' to create a container. Using 'docker ps' I obtained the 'CONTAINER ID', and then used 'docker inspect <CONTAINER_ID> | grep IPA' to obtain the IP address of the running container.
I now want to connect to the container using SSH with 'ssh root@<IP address>', but the command gives the following error: 'Operation timed out'.
Further investigation shows that I cannot even ping the <IP address>: 'Request timeout for icmp_seq 0'.
How can I connect to the container using SSH? What is the correct command?
UPDATE: This IS NOT a duplicate (as stated above). The entry that begins "The scenario you described" is the correct solution.
The scenario you have described is the approach that would be used on 'normal' Docker.
As Docker on Mac has been created from scratch specifically for the Mac, it has been tweaked to make it easier to use. Therefore, the IP address of the container cannot be used in this way on the Mac.
The documentation Getting Started with Docker for Mac states that:
Previous beta releases used docker as the hostname to build the URL.
From this release forward, ports are exposed on the private IP
addresses of the VM and forwarded to localhost with no other host name
set. See also, Release Notes for Beta 9.
Therefore, the correct way to SSH into a container is to spin it up on Docker for Mac using a port mapping to the SSH port (22). e.g.
docker run -d -p 2022:22 <Image Name>
And the SSH connection is initiated using this command (N.B. it uses 'localhost' with the specified port instead of having to determine and use the container's IP address):
ssh -p 2022 root@localhost
N.B. It is NOT possible to simply map port 22 to itself, i.e. '-p 22:22', as this caused the following error (at least it did for me!):
docker: Error response from daemon: driver failed programming external
connectivity on endpoint pensive_wilson
(2e832b82fc67d3e48864975c6eb02f6c099e34eee64b29634cfde286c41e00a7):
Error starting userland proxy: Failed to bind: EADDRINUSE.
To get a bash prompt you can use docker exec -ti <container-name-or-id> /bin/bash. If you want to use ssh, and to ensure that an ssh daemon is up and running, you should expose the corresponding port from the container with the -p parameter, like this: docker run -d -p 22:22 my_image.
