I am fairly new to, and interested in, the Go programming language. I intend to use it to code a simple website that includes a shopping cart.
How do I install Go on Windows Server? Is it the same process as on a regular computer?
What steps do I take to deploy the website on the Windows server once it is finished?
To what extent is it necessary to use HTML, CSS, or JavaScript?
How can I keep the site running on the server so that other users on our network/LAN can access it?
Any helpful information regarding web apps and/or Windows Server machines is appreciated!
This is as much a general DevOps question as it is specific to Go. There are a lot of things to consider here, and everyone will have varying preferences, but here are some guidelines I'd recommend:
It's not necessary to install the Go toolchain on your production server. Otherwise you have to maintain your Go installation in both your development and production environments, and if your production server runs a different OS than your Windows computer (e.g. a Linux distribution) this will get out of hand quickly. Instead, just develop on your local machine and cross-compile for the OS of your production server.
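For example, with a reasonably recent Go toolchain a cross-compile is just a matter of setting GOOS/GOARCH before go build. The output names and targets below are placeholders; on a Windows development machine you would set the variables with set rather than prefixing the command:

    # build for a Linux production server from a Mac/Linux dev machine
    GOOS=linux GOARCH=amd64 go build -o mysite

    # or build for a Windows production server
    GOOS=windows GOARCH=amd64 go build -o mysite.exe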
One thing you will need to keep on your production server is whatever database you choose to use with your Go program.
You can then transfer your compiled binary, along with your static web files, to your production server via SFTP or whatever method you prefer. Once the binary is on your production server you can fire it up from an SSH session, e.g. ./programname
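As a rough sketch, assuming an SSH/SFTP server is reachable on the production machine (the host name and paths here are made up):

    # copy the compiled binary and static assets to the production server
    scp programname user@example-server:/srv/site/
    scp -r static/ user@example-server:/srv/site/

    # then, from an SSH session on the server, start it
    cd /srv/site && ./programname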
Depending on how you want to use Go, you don't need Apache. Use the net/http package to serve up your HTML, CSS, and JS files. You can transfer these static files over after you've worked on them locally, or you can just keep them in a GitHub repo and git pull them from your prod server as needed, assuming you've installed Git there.
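A minimal sketch of such a server (the directory layout, route, and port are just assumptions; adjust to your own project):

    package main

    import (
        "log"
        "net/http"
    )

    func main() {
        // serve static assets (HTML, CSS, JS) from ./static at the site root
        fs := http.FileServer(http.Dir("static"))
        http.Handle("/", fs)

        // a dynamic handler, e.g. for the shopping cart
        http.HandleFunc("/cart", func(w http.ResponseWriter, r *http.Request) {
            w.Write([]byte("cart goes here"))
        })

        // listen for incoming requests until the process is stopped
        log.Fatal(http.ListenAndServe(":8080", nil))
    }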
You generally don't need to worry about keeping it "running" on your production server. http.ListenAndServe listens on your port for incoming requests. If your server reboots or shuts down, you can have your compiled binary start up automatically along with it.
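If the production server is Windows, one simple approach is a scheduled task that runs at boot (the task name and path below are placeholders); on Linux you would typically use a systemd unit or similar instead:

    schtasks /Create /TN "GoSite" /TR "C:\apps\programname.exe" /SC ONSTART /RU SYSTEM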
You can also work with things like Vagrant, VirtualBox, and Ansible to closely mimic your production environment and to spin up new servers according to your desired specs.
Hello, I have been wanting to get into working with a framework, and Laravel seems like a decent one to try.
I have seen a lot of tutorials that tell you how to set up Laravel locally with Homestead or variants.
I want to install and set up Laravel on my dedicated remote server with my hosting company. From there I want to be able to work with it from my local MacBook or Mac Pro.
I have not been able to find a good tutorial that makes this happen the way I want to do it.
I work with PHP and related technologies daily, but I usually log in to FTP, edit files with TextWrangler, save them, and go about my day, so my methods are dated and not efficient.
One side note: I also have a Dell PowerEdge server running CentOS and VestaCP in my office as my development server, so nothing is done locally per se (on my own computer). The question and answer will therefore apply to both my remote server and my remote-but-local development server.
Any suggestions are always welcome.
Best Regards,
Bradley
Assuming you have full root access to your remote servers, you should install Composer on them and install Laravel in whichever way suits you. Then you can edit your project files just as if you were working on them locally.
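For example, one common way to get Composer onto the server and start a fresh project looks roughly like this (the project name is just an example):

    # install Composer globally (one approach of several)
    curl -sS https://getcomposer.org/installer | php
    mv composer.phar /usr/local/bin/composer

    # create a new Laravel project
    composer create-project laravel/laravel mysite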
Seriously though, the biggest thing you should add to your development arsenal (in case you haven't already), and the one that will make your development process so much more resilient, is Git.
Set up a free Bitbucket account, get a free Git client, and learn how commits, pushes, pulls, branches and deployments work. The easiest approach for deployment is to use a service such as Envoyer.
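The day-to-day workflow then looks roughly like this (the repository URL and branch names are placeholders):

    git clone git@bitbucket.org:youruser/yourapp.git
    git checkout -b feature/cart        # work on an isolated branch
    git add . && git commit -m "Add cart controller"
    git push origin feature/cart        # push for review/deployment
    git pull origin master              # pull in other people's changes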
That way you can develop and test locally (even if 'locally' is a remote machine) and not really have to worry about breaking your app by making a mistake in a controller or something on the live server.
I'm trying to figure out an easy way to distribute a "dev environment" for working with my organization's WordPress site. We currently have a local Linux server running the WordPress site, and a VirtualBox image that is horribly out of date and a very poor representation of this server. We currently distribute this image to team members for their local development, which causes lots of problems because the local image is often too different.
I'm not too worried about the database aspect of things; I'm thinking of just doing weekly dumps from the live server which developers can import to keep their local copies up to date.
I'm more interested in finding an easy way to distribute a preconfigured stack to users on OS X or Windows that already has PHP/Apache/MySQL configured by me, plus a Git client set up to pull all the static files on command: something the user can just run, then go to localhost:8000 to see it. I'd also like some way for them to edit the files that were pulled from the Git repository.
I'm currently looking into Docker and Vagrant, but I'm not sure which is more appropriate for this task; Docker seems like it would be better suited to Linux machines. I know Vagrant supports mapping external folders into the VM, which seems like it'd solve my problem, but I wanted to ask for more suggestions before I start learning Chef/Puppet/etc.
I think both Vagrant and Docker can be used to solve your problem.
Vagrant may be more suitable for sharing the environment with Windows/Mac machines, but Docker integration on those systems is getting better every day thanks to tools such as boot2docker.
Docker, by contrast, requires a modern Linux kernel or one of those tools.
If I had to develop the Vagrant option, I would set up a single machine with all the dependencies installed on it. For the installation you can use one of the provisioners available in Vagrant (e.g. Chef or Puppet); this may be easier if you have previous experience with them and/or if you are not very keen on Bash. There are a lot of examples you can check to see how to do it, such as https://github.com/r8/vagrant-lamp
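A bare-bones Vagrantfile for that kind of single-machine setup might look like this (the box name, port mapping, and provisioning script are assumptions, not a recommendation):

    Vagrant.configure("2") do |config|
      config.vm.box = "ubuntu/trusty64"
      # expose the web server on the host at localhost:8000
      config.vm.network "forwarded_port", guest: 80, host: 8000
      # map the project folder into the VM so files can be edited on the host
      config.vm.synced_folder ".", "/var/www/html"
      # install PHP/Apache/MySQL via a shell script (or a Chef/Puppet provisioner)
      config.vm.provision "shell", path: "bootstrap.sh"
    end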
Using Docker is also a very good option. To answer your question: you can share any local folder of the host machine with a container (using the docker run option -v or --volume). In this case I would run each of the services you want to provide (i.e. the PHP server, MySQL, Apache...) as an independent container and link them together using the docker run option --link. Writing the Dockerfiles to build these containers may turn out to be more difficult than using Chef or Puppet (and although you could use them to build the containers, the integration is not as good as with Vagrant), but with Docker you can take advantage of all the ready-to-use apps available on Docker Hub. I would also recommend a Docker tool called fig (www.fig.sh), which lets you run a linked, configured cluster of containers and manage all of them in a very comfortable way. Again, you can find very illustrative examples of this use case on the internet, for example https://github.com/kasperisager/phpstack
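As a sketch of the docker run approach (the image names, paths, and credentials are illustrative only, not specific to your site):

    # database container
    docker run -d --name db -e MYSQL_ROOT_PASSWORD=secret mysql:5.6

    # PHP/Apache container, linked to the database and sharing a host folder
    docker run -d --name web --link db:mysql \
        -v "$(pwd)/wordpress:/var/www/html" \
        -p 8000:80 php:5.6-apache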
I am trying to upgrade my web server. I have created a brand new instance of a latest generation virtual server on RackSpace that uses an SSD. On this brand new instance, I installed the following:
Google Chrome
FileZilla FTP Client
I then connected to a FileZilla FTP Server on a different server, which is hosting 2 image files that I am using to test. I then downloaded the 2 image files, which FileZilla reports as "successfully transferred". However, both of the image files are truncated! What could possibly be causing this?
A few things to note:
This only happens on the new instance if it is using an SSD. If I create an identical instance without the SSD (using SATA instead), the error does not occur.
On the server which is transferring the files, the files are also reported as having been transferred successfully. This server has been used as an FTP server for quite some time without any issues.
If I set up the new SSD instance as an FTP server and upload a bunch of files to it, some of them randomly get truncated by 2-10KB. Out of a ~150MB upload, I may end up with 150-200KB missing. If I transfer them again, a different subset of files gets truncated.
If I throttle the transfer speed on the FTP server to 100KB/s, the 2 image files transfer successfully without getting truncated. If I throttle the transfer speed to 500KB/s, the image files get truncated the same way as if there was no throttling.
Any ideas on how this could be happening?
Update: It is not related to FileZilla. Here is the same issue using ftp on the command line:
The solution is documented here: http://www.rackspace.com/knowledge_center/article/disabling-tcp-offloading-in-windows-server-2012
That article is for Windows Server 2012. In my case, I was using Windows Server 2008. To get to the network adapter properties, go to
Right click on Computer --> Properties
Device Manager
Open up the Network adapters drop-down, right click your adapter --> Properties
Go to the Advanced tab
Disable everything except UDP Checksum Offload.
Important note: If only some of the options are disabled, you will notice a massive performance degradation. Performance will go back up to normal levels after you have disabled all of the necessary options.
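On Windows Server 2008 you may also be able to turn off some of this from an elevated command prompt rather than through the adapter GUI. The following netsh commands are the ones usually cited for TCP offloading; treat them as a supplement to, not a replacement for, the per-adapter settings described above:

    netsh int tcp set global chimney=disabled
    netsh int ip set global taskoffload=disabled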
The reason it says that the transfer is complete is that closing the socket is, unfortunately, how FTP defines a completed transfer. (It opens up a data connection and sends the data. Closing the connection means the file has been completely sent.)
For some reason, it seems like the connection is prematurely closing.
Personally, this sounds really bizarre to me and it might be a driver or hardware problem, but I would try the following:
1. Try passive mode FTP. The command-line client uses PORT mode by default; PASV is more firewall-friendly.
2. Try disabling all software firewalls (like Windows Firewall) and retrying.
At home we have a lot of media content spread across three computers. I was thinking of centralizing it on one single machine that runs a web server hosting all the content, so that when we fetch a URL like http://localhost/music it shows an HTML page listing all the music files on that computer, and when we click on a file, it gets downloaded.
I don't know what to use to build this; I'm thinking Apache as the server and PHP as the language... any suggestions?
Install FreeNAS on your machine
http://www.freenas.org/
You don't even need to go that far. Just install Apache on a computer and make your music folder a web directory. You should be able to navigate to a page like http://whateverip/music/ and Apache will serve up a listing page like you are suggesting.
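A minimal configuration snippet for that, assuming Apache 2.4 and a made-up filesystem path, would be something like:

    Alias /music "/srv/media/music"
    <Directory "/srv/media/music">
        # let Apache generate the file listing page
        Options +Indexes
        # Apache 2.4 syntax; 2.2 uses Order/Allow directives instead
        Require all granted
    </Directory>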
If you're using Windows, XAMPP is a pretty good solution. You don't really need the PHP part, because all you need to do on the server after installing a WAMP stack is place all your content in the folder that the server serves from.
If you're using Linux, definitely install a LAMP stack; it's an easy install whether you're using Ubuntu or another Linux distribution.
If you're using a Mac, there are solutions similar to LAMP, some of them hassle-free.
In any case, all of these use Apache as the server, and that is the way to go. It's not as hard as you think: just install the software and put your files in the appropriate directory. Then, from each of the other machines, to find the files you would probably type something like this into your address bar:
http://192.168.1.101/
which of course is the address of the web server on your local area network. Your server's address will very likely be similar to that, perhaps with the last two numbers being different.
I would second the FreeNAS route if what you want to store is valuable to you in terms of time, money, or electricity. It makes a great headless file server with a web GUI and supports the ZFS filesystem (similar to software RAID 5, so you can lose a disk and not lose your data). More valuable to me, it also supports replication to a duplicate server.
It can run on very low-power hardware, using FreeBSD as the OS. I measured one of my boxes and it uses about 45 watts. The OS loads from a USB stick, so all your disks become data disks. It holds the system in RAM, so the USB stick does not get any writes and lasts a very long time. It will serve CIFS for your Windows boxes, AFP for your Macs, and NFS for your Linux systems. Plug-ins allow for such things as DLNA media servers. I have had three boxes running stable with no reboots for over a year, with 6x 3TB drives per box.
A typical hardware setup might be an ASUS C60M1-I AMD Fusion board (combined CPU/VGA/NIC/6x SATA) for about 75 bucks, 16 GB of RAM, a PSU, and a USB stick. Add 3 hard drives and voila, you have a low-power RAID file server. Get two and you can replicate one server to another in a different physical location.
I'm trying to set up an SSH server on Windows Server 2003. What are some good ones? Preferably open source. I plan on using WinSCP as a client, so a server that supports the advanced features implemented by that client would be great.
I've been using Bitvise SSH Server and it's really great. From install to administration it does it all through a GUI, so you won't be putting together an sshd_config file. Plus, if you use their client, Tunnelier, you get some bonus features (like mapping shares, port forwarding set up server-side, etc.). If you don't use their client it will still work with the open-source SSH clients.
It's not open source and it costs $39.95, but I think it's worth it.
UPDATE 2009-05-21 11:10: The pricing has changed. The current price is $99.95 per install for commercial use, but it is now free for non-commercial/personal use. Here is the current pricing.
I agree that Cygwin/OpenSSH is the best choice, but its setup can be involved, to say the least. Here is a document to get you started, though: Installing OpenSSH
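The usual sequence, once the Cygwin openssh package is installed, is roughly the following (run from a Cygwin shell with administrative rights, and answer the script's prompts as appropriate; consider this a sketch rather than the full procedure from that document):

    ssh-host-config -y    # generate host keys and register the sshd service
    cygrunsrv -S sshd     # start the service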
I've been using Bitvise SSH Server for a number of years. It is a wonderful product and it is easy to set up and maintain. It gives you great control over how users connect to the server, with support for security groups.
copssh - OpenSSH for Windows
http://www.itefix.no/i2/copssh
Packages essential Cygwin binaries.
OpenSSH is a contender. Looks like it hasn't been updated in a while though.
It's the de facto choice in my opinion. And yes, running under Cygwin is really the nicest method.
VanDyke VShell is the best Windows SSH server I've ever worked with. It is kind of expensive though ($250). If you want a free solution, freeSSHd works okay. The Cygwin solution is always an option; I've found, however, that it is a lot of work and overhead just to get SSH.
You can run OpenSSH on Cygwin, and even install it as a Windows service.
I once used it this way to easily add backups of a Unix system - it would rsync a bunch of files onto the Windows server, and the Windows server had full tape backups.