EDIT: Zappa does not use EC2 or Elastic Beanstalk, but AWS Lambda and API Gateway.
So there is probably no way.
I have a question about Zappa's EC2 serverless services.
Launching zappa update dev creates one zip file and uploads it. Because my upload speed is slow, it would be great to have the ability to upload only changed files, especially at the beginning of a project (config files etc.). Every single file change takes 5 minutes.
Yup. There is no way.
I had the same problem (slow home upload) and more or less solved it by deploying through a CI system (which has a much, much faster upload).
If you are deploying an open source app you can use Circle CI's free tier for open source; for closed source you can take advantage of GitLab's free CI offering.
Here is my Circle CI config file for my open source project.
It used to take me 2-3 minutes just to upload the zip from my local computer; now it takes less than a minute to run all the tests and deploy.
For me, the main advantage is that I don't have to nervously watch the shell for the next few minutes and can just continue with my work.
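The exact config is project-specific, but the deploy step boils down to very little: check out the repo, install dependencies, run the tests, and call zappa update from the CI machine's fast connection. Here is a minimal sketch as a Python helper a CI job could invoke (the file name, stage name, and test command are assumptions, not my actual setup):

```python
# deploy.py - hypothetical helper a CI step could run after installing
# dependencies; not the actual Circle CI config mentioned above.
import subprocess
import sys

def run(cmd):
    print("+", " ".join(cmd))
    if subprocess.run(cmd).returncode != 0:
        sys.exit(1)

if __name__ == "__main__":
    run(["pytest"])                  # run the test suite first
    run(["zappa", "update", "dev"])  # then push the new package from the CI box
```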
I'm using Gatsby.js and gatsby-image to build a website that currently has about 300 images on it. I'm encountering 2 problems:
gatsby develop and gatsby build take many minutes to run because gatsby-image generates multiple resolutions and svg placeholder images for every image on the site. This makes for a great user experience once that pre-optimization is done, but a very slow development experience if I ever need to re-build.
My current workaround is to remove all but a few images during development.
Deploying to GitHub Pages takes too long with so many images (300 base images * 3 resolutions + 1 SVG representation). Trying to deploy the website to GitHub Pages causes a timeout. Going to attempt deploying to Netlify instead, but I anticipate the same problem. I also don't want to have to re-upload the images every time I make a change on the website.
I don't feel like my <1000 images should qualify as "image heavy", but given poor to mediocre upload speeds, I need a way to upload them incrementally and not re-upload images that have not changed.
Is there a way to upload images separately from the rest of a build for a Gatsby website?
I think maybe I could get something working with AWS S3, manually choosing which files from my build folder I upload when creating a new deploy.
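For what it's worth, here is a rough sketch of that S3 idea in Python with boto3 (the bucket name and build directory are placeholders): it hashes each local file and skips anything whose MD5 already matches the ETag stored in S3, so unchanged images never get re-uploaded.

```python
# sync_images.py - rough sketch only; bucket name and local directory are
# hypothetical. Uploads only files whose content differs from the copy in S3.
import hashlib
import os

import boto3

BUCKET = "my-gatsby-site-images"   # hypothetical bucket
LOCAL_DIR = "public/static"        # wherever the build puts processed images

s3 = boto3.client("s3")

def local_md5(path):
    h = hashlib.md5()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(1 << 20), b""):
            h.update(chunk)
    return h.hexdigest()

def remote_etags(bucket):
    """Map of key -> ETag for everything already in the bucket."""
    etags = {}
    for page in s3.get_paginator("list_objects_v2").paginate(Bucket=bucket):
        for obj in page.get("Contents", []):
            etags[obj["Key"]] = obj["ETag"].strip('"')
    return etags

def sync(local_dir, bucket):
    existing = remote_etags(bucket)
    for root, _, files in os.walk(local_dir):
        for name in files:
            path = os.path.join(root, name)
            key = os.path.relpath(path, local_dir).replace(os.sep, "/")
            # For single-part uploads the ETag is the MD5 of the content,
            # so a matching hash means the image has not changed.
            if existing.get(key) == local_md5(path):
                continue
            print("uploading", key)
            s3.upload_file(path, bucket, key)

if __name__ == "__main__":
    sync(LOCAL_DIR, BUCKET)
```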
Anyone else dealt with handling an image-heavy Gatsby site? Any recommendations for speeding up my build and deploy process?
Disclaimer: I work for Netlify
Our general recommendation is to do the image optimization locally and check those files into GitHub, since doing all that work can take longer than our CI allows (15 minutes) and it is repetitive.
There is also an npm module, https://www.npmjs.com/package/cache-me-outside, that lets you cache the things you've built alongside your dependencies; it may do that for you without abusing GitHub (instead abusing Netlify's cache :)).
See another answer from Netlify: smaller source images (as mentioned by #fool) or offloading to a service like Cloudinary or Imgix.
We are new to Windows Azure and are developing a Sitefinity web application. At the beginning of the project, we deployed the complete code to our different environments using Sitefinity Thunder, which publishes everything. Now that we are in the middle of development, we usually only need to upload newly created files, which can be quite few in number (1 or 2, maybe a handful). If we deploy with Thunder, it publishes all files and then deploys the complete code, which takes a good amount of time. Is there a way we can deploy only changed or new code files via Sitefinity Thunder, or is there any other way to upload only the changed files?
Please help.
I use Beyond Compare 3 from Scooter Software to move files to our different Sitefinity environments. I haven't used Sitefinity Thunder to deploy my sites before. Also, you might want to post your question in the Sitefinity Devs group on Google+. Below is the link.
https://plus.google.com/communities/101682685148530961591
This is not easy to do, and Azure is not designed for it, although many people have requested the feature. One way to achieve it is to enable Remote Desktop for the cloud service; by logging onto the server, you can then make some kind of connection to where your files are stored and copy them into the cloud service. However, the instance can always be rebooted or even re-provisioned from scratch, so I don't know if there are any guarantees that this is a safe way to do it.
I'm new to Azure. I have a script that automatically installs Apache, Ruby, and configures both to run a basic Ruby on Rails project. This script currently runs on Windows Server.
I'm now trying to get this working in Azure. I've signed up for a subscription, and in Visual Studio I've opened a new Worker Role project.
I'm a little stuck now though.
1) Where should I place the installation files and project files (ruby, apache, etc)?
2) Where is the best place to put the script?
Any help would be appreciated. Thank you for your time :)
Within a Visual Studio project, you have three places to get things up and running:
Startup script. This runs before your workerrole.cs methods get called. It's ideal for installing software requiring elevated permissions, tweaking the registry, etc. For Apache, there's no need for elevated permissions - it's just xcopy plus environment variables.
workerrole.cs OnStart() - this handler gets called prior to your role instances being added to the load balancer. You can download your Apache zip from blob storage, unzip it to a local folder, and start it up.
workerrole.cs Run() - same as OnStart(), but your role instances are now in the load balancer. I wouldn't recommend setting up a web server here.
Things are a bit different when setting up Tomcat from Eclipse, as there's no workerrole.cs. Instead, you have a startup script. Supplied with the Windows Azure plugin for Eclipse are several sample scripts: one for Tomcat, one for JBoss, etc. You can look at how those sample scripts set up the environment and launch the web server.
One bit of guidance: while you can package Tomcat, Ruby, and other runtime bits with your deployment, this also grows the deployment size. I typically put 3rd-party bits in blob storage and then download them to my role instances at startup. This download is extremely fast. It also affords me the ability to update these bits without needing to redeploy (for instance: Tomcat has already gone through a half-dozen incremental updates since I pushed up a deployment a few months ago; I just upload a new Tomcat zip and recycle my role instances).
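The startup pattern is the same whichever language you use: pull the archive down from blob storage, extract it to a local folder, and launch the server. A sketch of that pattern in Python, with the blob URL, paths, and start command all being hypothetical placeholders:

```python
# startup_fetch.py - sketch of "download runtime bits from blob storage at
# startup". The blob URL, local paths, and launch command are placeholders.
import subprocess
import urllib.request
import zipfile

BLOB_URL = "https://myaccount.blob.core.windows.net/runtime/apache.zip"  # hypothetical
ARCHIVE = r"C:\Resources\apache.zip"
INSTALL_DIR = r"C:\Resources\apache"

# 1. Pull the archive from blob storage (fast, since it never leaves the data center).
urllib.request.urlretrieve(BLOB_URL, ARCHIVE)

# 2. Unzip it to a local folder on the role instance.
with zipfile.ZipFile(ARCHIVE) as zf:
    zf.extractall(INSTALL_DIR)

# 3. Start the server; swap in whatever launch command your runtime needs.
subprocess.Popen([INSTALL_DIR + r"\bin\httpd.exe"])
```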
Every time I want to add new code to my site, I have to modify a copy of the file outside of users' view to debug it before updating the real file users see.
I usually create a copy of the file I want to change and test all changes on it, but sometimes the file is only included by another file, so I have to create two copies, and sometimes it becomes even more complicated.
How is this normally done? Are there any tools to simplify the process, for example an environment to test my site on my PC so I don't have to upload files to the server each time I update something? Any info about beta testing new features would be appreciated.
Most people have a 2nd server (potentially a virtual machine) configured exactly the same as their live (production) website. Where this 2nd server is located is completely up to you, but it should match your live site by using the same versions of software and same file structure.
I also like the idea of a staging server suggested by Sean. Again, your post doesn't say too much about your production web server and all of the features you're using (are you running scripts on the server? PHP? some version of SQL?). But for a simple setup, you can run a copy of the Apache web server on your own PC, or use an all-in-one bundle like XAMPP, which packages Apache, MySQL, and PHP for you.
I'm trying to figure out a way to automate the deployment to our QA environment. The problem is that our release is quite big, so it needs to be zipped, FTP'd, and then unzipped on the QA server. I'm not sure how best to unzip remotely.
I can think of a few options, but none of them sound right:
Use PsExec to execute a remote commandline call on the QA server to unzip the release.
Host a web service on the QA server that unzips the release and copies it to the right place. This service can be called by our release when it's done uploading the files.
Host a Windows service on the QA server that monitors a file location and does the unzipping.
None of these are pretty though. I wonder how others have solved this problem?
PS: we use CruiseControl.NET to execute a NAnt script that does the building, zipping and FTP.
Instead of compressing and un-compressing yourself, you can use a tool like rsync, which can transparently compress data during file transfer. The -z option tells rsync to use compression.
But I assume you are in a Windows environment, in which case you could use cwRsync (which is "rsync for Windows").
Depending on your access to the QA box this might not be a viable solution. You'll need to:
install the cwRsync server on the remote machine and
allow the traffic through any firewalls.
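To make that concrete, the whole zip/FTP/unzip sequence collapses into a single rsync invocation. A small sketch of driving it from a deploy script in Python (the server name and destination path are made up; with cwRsync the same command line works from Windows):

```python
# push_to_qa.py - sketch of replacing the zip/FTP/unzip steps with one rsync
# call. The QA host and paths below are hypothetical.
import subprocess

subprocess.run(
    [
        "rsync",
        "-avz",        # archive mode, verbose, compress data in transit (-z)
        "--delete",    # remove files on QA that no longer exist locally
        "build/",      # local build output
        "deploy@qa-server:/deploy/release/",  # hypothetical QA destination
    ],
    check=True,
)
```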
At the last place I worked, we had a guy write a Windows service on the CI box to do the unzipping. TFS (Team Foundation Server) finished the build and notified a service that zipped the completed build and copied it to the CI box. The CI box picked up the new file and unzipped it. It may have been a bit heavyweight, but it worked well - and he was careful to log all actions to the event log, so it was easy to diagnose if a server had been reset and the service hadn't started.
Update: One thing we would have liked to improve in that process was to have the service on the CI box check for zip files and uncompressed files older than x months, for purging purposes. We routinely ran out of disk space (it was a VM that we rarely looked at), and had to manually purge old builds when it happened.
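A sketch of what such a watcher (plus the purging step) might look like, stripped down to a plain Python loop (paths and the retention period are hypothetical; a real version would run as a Windows service and write to the event log instead of printing):

```python
# build_watcher.py - sketch of "monitor a folder, unzip new builds, purge old
# ones". Paths and retention period are hypothetical placeholders.
import os
import shutil
import time
import zipfile

DROP_DIR = r"C:\builds\incoming"     # where zipped builds arrive
DEPLOY_DIR = r"C:\builds\deployed"   # where builds get unzipped
MAX_AGE_DAYS = 3 * 30                # purge anything older than ~3 months

def unzip_new_builds():
    for name in os.listdir(DROP_DIR):
        if not name.endswith(".zip"):
            continue
        dest = os.path.join(DEPLOY_DIR, name[:-4])
        if os.path.exists(dest):
            continue  # this build has already been unzipped
        with zipfile.ZipFile(os.path.join(DROP_DIR, name)) as zf:
            zf.extractall(dest)
        print("deployed", name)

def purge_old_items():
    cutoff = time.time() - MAX_AGE_DAYS * 86400
    for folder in (DROP_DIR, DEPLOY_DIR):
        for name in os.listdir(folder):
            path = os.path.join(folder, name)
            if os.path.getmtime(path) < cutoff:
                print("purging", path)
                if os.path.isdir(path):
                    shutil.rmtree(path)
                else:
                    os.remove(path)

if __name__ == "__main__":
    while True:
        unzip_new_builds()
        purge_old_items()
        time.sleep(60)
```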