Windows - Zalenium file upload

I am working with Zalenium and the execution is running. I have a requirement where I have to upload a file from the Windows machine on which my Docker containers are running. Can anyone help? I will explain the scenario in detail if I am not clear.

You can check "Mounting volumes/folders across containers" under https://opensource.zalando.com/zalenium/#docker; after that, you can upload the file by referencing the directory inside the container.
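A minimal sketch of the idea, assuming a host folder C:\Users\me\uploads (the /tmp/node prefix is what Zalenium's documentation describes for mapping a host folder into the node containers; exact Windows path syntax depends on your Docker setup):

docker run --rm -ti --name zalenium -p 4444:4444 ^
  -v /var/run/docker.sock:/var/run/docker.sock ^
  -v C:\Users\me\uploads:/tmp/node/home/seluser/uploads ^
  --privileged dosel/zalenium start

Your test would then upload the file by sending the in-container path, e.g. /home/seluser/uploads/report.pdf, to the file input.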

Related

Run remote files directly in dockerfile

I am wondering if it is possible to RUN a remote file stored in an NFS share when building an image from a Dockerfile.
Currently I am using the COPY command and then the RUN command to execute the files; however, many of the files I need to create the image are extremely large.
Is it possible to execute files stored in an NFS share directly from the Dockerfile without having to copy them all over?
You can only RUN files inside your container, so they need to be copied into it first.
What you can do is move the COPY commands to the beginning of your Dockerfile so that they are cached and don't need to be copied every time you change a command later in the Dockerfile.
You can RUN curl ... to grab the remote file and then execute it, sure.
But this will only run at image build time, not during the lifecycle of the container.
You could also mount the NFS volume on your host and then COPY the files.
Otherwise, remote execution would be a pretty basic security flaw and shouldn't be possible under any circumstances.
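To illustrate the curl approach, a hedged Dockerfile sketch (the URL and script name are placeholders; note again that this runs once, at build time):

FROM ubuntu:20.04
RUN apt-get update && apt-get install -y curl
# Fetch the remote script at build time, execute it, then delete it so it does not bloat the layer.
RUN curl -fsSL http://nfs-gateway.example.com/setup.sh -o /tmp/setup.sh && sh /tmp/setup.sh && rm /tmp/setup.sh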

Can't get druid running

I have Windows with the Linux subsystem (WSL) and I am trying to run Druid. I am getting the message CANNOT CREATE FIFO. What should I do to avoid it?
I faced the same issue myself while trying to run through WSL Ubuntu. It seems that the FIFO file can't be created on a mounted drive, i.e. /mnt/c/.
As a workaround, you have to copy the entire Druid installation folder over to an internal folder, e.g. /usr/share/, and launch it from there.
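For example, assuming the distribution was unpacked at /mnt/c/druid and you use the quickstart launcher (folder names and the launcher script vary by Druid version):

cp -r /mnt/c/druid /usr/share/druid
cd /usr/share/druid
./bin/start-micro-quickstart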

GCE Windows startup Script is not running

I have a simple Django app which I want to keep running on a specific GCE instance. Sometimes the instance gets restarted for reasons not in my control. I created a batch script and tried putting it in the Startup folder, in both the user's folder and the common folder. It didn't work. I tried supplying the script via sysprep-specialize-script-url (using Cloud Storage), sysprep-specialize-script-cmd and sysprep-specialize-script-bat. It didn't work. Here's the content of the batch script:
cd C:\Users\kartik_domadiya\Desktop\happierMiscGoogleCloud
manage.py runserver 0.0.0.0:80
pause
I tried running C:\Program Files\Google\Compute Engine\metadata_scripts\run_startup_scripts.cmd manually and it worked (with any metadata key). So I can see that there's no problem with the script itself.
I even tried putting the batch script in Task Scheduler, which didn't work either.
So is there any way I can debug the problem and find out why the batch script isn't working? I am using Windows Server 2012 R2, if that matters.
PS: I know that's a development server and should not be used in production.
I moved the code to C:\code (basically out of any particular user's folder), gave all users access to it (Right Click > Properties > Security), updated the batch file, and put it into the Startup folder (Run > shell:startup).
It started working after that. I suppose the issue was due to access permissions.
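For reference, a sketch of the relocated batch file under that setup (calling python explicitly is an addition on my part; it avoids relying on the .py file association and assumes python is on PATH):

cd C:\code\happierMiscGoogleCloud
python manage.py runserver 0.0.0.0:80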

Building and running a docker image for a Go executable

I am trying to containerize my Go application using Docker. I have a fully working executable on my system. To run it in a container I have created a Dockerfile.
FROM golang:1.7
EXPOSE "portno"
I have kept my Dockerfile very simple because I already have an executable file that runs on my system. Please tell me what I should add to get the Go app running in a container. Right now the app does not run because many of the required files are not copied into the container.
You need to add your executable file to your container using the ADD command:
ADD ./app /go/bin/app
Then you need to tell Docker that it should be executed as the main container process:
CMD ["/go/bin/app"]
Note that it may be better to build your application from source inside your container; this can be done when you build your Docker image.
As an example, see this article for more information: http://thenewstack.io/dockerize-go-applications/
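Putting the answer's pieces together, a minimal Dockerfile sketch (the binary name ./app and port 8080 are assumptions; note that EXPOSE takes a plain port number, not a quoted placeholder):

FROM golang:1.7
# Copy the pre-built binary into the image.
ADD ./app /go/bin/app
# The port the app actually listens on.
EXPOSE 8080
# Run the binary as the main container process.
CMD ["/go/bin/app"]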

openshift DIY, 503 error after deleting and adding again testrubyserver.ruby file

I am trying the OpenShift DIY cartridge. I use a Windows system to manage the server from the command line. I managed to run a simple HTML5 website. I deleted the testrubyserver.ruby file from the webpage folder for test purposes and then added it back to my web folder. Now I have a 503 error. No restart, no stop, no start helps; I am stuck at 503. Does anyone know what to do? How can I make testrubyserver.ruby run again?
Solved my problem. I checked the log file in the folder app-root/logs. There I found out that
nohup: failed to run command `/..//testrubyserver.rb': Permission denied
I changed the file's permissions in FileZilla from rw to rwx to make it executable, restarted the server, and then it worked.
I do not know if this is the right approach, but at least it makes my app run again.
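If you have SSH access instead of FileZilla, the equivalent fix from the command line, run in the directory containing the file, would be:

chmod u+x testrubyserver.rb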
