Hosting images on Google Code

I want to put several screenshots of a project I'm working on into the wiki and project pages for the project on Google Code. How do I host or attach the image files? If this isn't possible, where should I put them?
Update:
This question has spawned another: Get certificate fingerprint of HTTPS server from command line?

Yes, you can put these files in the same repository that holds the wiki - i.e. http://code.google.com/p/pydlnadms/source/checkout?repo=wiki.
These files will be served under http://wiki.pydlnadms.googlecode.com/hg/.
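A minimal sketch of how that might look with Mercurial; the clone URL follows the pattern shown on the checkout page linked above, and the images/screenshot1.png name is just an example:

hg clone https://wiki.pydlnadms.googlecode.com/hg/ pydlnadms-wiki
cd pydlnadms-wiki
mkdir images
cp ~/screenshot1.png images/
hg add images/screenshot1.png
hg commit -m "Add project screenshot for the wiki"
hg push

The file should then be reachable at http://wiki.pydlnadms.googlecode.com/hg/images/screenshot1.png, and pasting that URL into a wiki page (on its own line, since Google Code's wiki syntax renders bare image URLs inline) should display the screenshot.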

Related

VSCode: how to set up for local edit and FTP deployment

I used to use Dreamweaver. I have a huge Classic ASP website: I edit the files on my local system and, when done, I can upload the file(s) via FTP to the remote web server. Now I'm trying to switch to VSCode. I've installed ftp-simple, ftp-sync and deploy, but I can't find the setup that gives Dreamweaver-like behaviour. For example, for each file I want to upload/deploy I have to locate the exact location in the remote file tree.
I really feel like deploy deserves more attention. I spent the past four days or so looking for an extension that does just that: auto-upload from a local folder to an FTP folder. I wanted to make git work for my website, but couldn't get that working on the server with ftp-simple or ftp-sync, because those extensions only download the opened files or open a different temporary folder each time. I've now set up deploy and got exactly what I wanted thanks to your tiny comment, thank you!
(I'm sorry if this post is too old to comment on, but I browsed Stack Overflow for days to find this, so I thought it might help others in the future to point this out.)
It sounds like you're just missing your mapping configuration. Most text editor FTP packages include a configuration file where you specify the server, your credentials, and the root folder of your FTP server. Have you specified this?
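For ftp-simple specifically, that configuration is a JSON file opened with the extension's config command; a rough sketch is below. The host, credentials, and paths are placeholders, and field names may vary between extension versions:

[
    {
        "name": "my-site",
        "host": "ftp.example.com",
        "port": 21,
        "type": "ftp",
        "username": "user",
        "password": "secret",
        "path": "/public_html",
        "autosave": true,
        "confirm": false
    }
]

Here "path" is the remote root your local workspace maps to, and "autosave": true uploads each file to the matching remote location whenever you save it, which is roughly the Dreamweaver-style behaviour described in the question.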

Is there a way to recover an entire website from the wayback machine?

My website files got corrupted and I lost all the backup files somehow. Can anyone please suggest the process to download the entire site?
It's a simple HTML site. Once it is downloaded, how can I host it?
Please help.
You can't use a regular crawler, because the content served still contains the original links, so the crawler wanders off the first page immediately if you don't rewrite them; in the browser they are rewritten by a client-side script to point back to the Wayback Machine.
If it's simple HTML, as you mentioned, and very small, you might want to save the pages manually or even copy the contents by hand into a new website structure. If it's not small, then try the tools mentioned in the answers to this similar question on Super User: https://superuser.com/questions/828907/how-to-download-a-website-from-the-archive-org-wayback-machine
After downloading it, you may have to check the structure of the downloaded files for links that were rewritten incorrectly or for missing files. The links that point to files belonging to the website should be local links, not external ones. Then you can host it again on a web hosting service of your choice.
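One of the tools discussed in that Super User thread is the wayback_machine_downloader Ruby gem; a rough sketch of using it and then checking the result locally (example.com is a placeholder for your domain):

gem install wayback_machine_downloader
wayback_machine_downloader http://example.com
cd websites/example.com
python3 -m http.server 8000

By default the gem writes the snapshot into a websites/<domain> folder; serving that folder with any static web server (the Python one-liner above is only for a quick local check) lets you verify the links before uploading the files to your real host.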

I have completed my site in Dreamweaver. Do I just upload my root folder to my host now?

So, as the title says, I have completed the site for my first client in Dreamweaver (I'm still studying in college). Do I just upload it to their hosting site of choice, and will all the links work? I know how to upload; I'm just curious about how the links, such as images, and my root folder will stay intact.
Yes, generally they will. You can try it for yourself and see, as long as you're referring to the images with paths relative to your file.
Upload all your files to your live server's 'htdocs' or 'public_html' folder. All your links should work fine if done properly. As long as you move all your folders (images, css, etc.) to the live server, everything will work.
Dreamweaver doesn't require anything special or any configuration before deploying; you can even deploy directly from Dreamweaver to be extra sure nothing breaks while deploying to the production server.
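To make the point about relative paths concrete, a small illustrative example (the file names are made up): a reference that is relative to the page keeps working wherever the root folder ends up, whereas a link to your local disk does not.

<img src="images/logo.png" alt="Logo">  <!-- relative path: survives the upload -->
<img src="file:///C:/mysite/images/logo.png" alt="Logo">  <!-- local absolute path: breaks on the server -->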

Best way to share a common CSS and Scripts between web applications?

Using: Dreamweaver CS6, ColdFusion 10, CFBuilder 3 (soon)
I'm currently developing three separate web projects but am using the same set of resources for each project:
Same CSS reset and initialisation styles, e.g. grid layouts
Same jQuery initialisation files
At the moment each web project has a copy of the same files in its web root. I'd like to have one place that all the sites link to for these common shared resources. I can only think of two ways to do this, both similar:
Decide on a 'master' site and place all the CSS and JS files in that site only.
Make a new site called 'shared' or 'common' and put all the common stuff in there. But this new site still has to reside on some domain, so I still need to choose a 'master' project.
Both of the above options are somewhat difficult because there is no 'master' site; none of the sites are linked to each other in any way.
Say I've decided on a master site; I could then link to those files using a fully qualified URL. But this means that in my development environment I won't be able to see these files or use any kind of IDE introspection to see what's in them when I make changes (e.g. the way IntelliSense detects what's in the files and gives you hints).
Overall, what's the best way to share these resources for both development and production environments?
Update: After the answers I realised I need a virtual directory in IIS and also a virtual folder in my Windows 7 local folders where my web project files are kept.
You need to use web server virtual directories (for IIS) or aliases (for Apache).
Simply create an alias/virtual directory called 'scripts' (or whatever you want) in the web root of each site (you can do this in your development environment too, if you are using IIS or Apache) and then use
<script src="/scripts/my.js"></script>
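A sketch of how the alias/virtual directory itself might be created; the site name and the C:\shared\scripts and /var/www/shared/scripts folders are placeholders for wherever you keep the shared files.

On IIS, from an elevated prompt:
%windir%\system32\inetsrv\appcmd add vdir /app.name:"Default Web Site/" /path:/scripts /physicalPath:"C:\shared\scripts"

On Apache 2.4, in the relevant VirtualHost:
Alias "/scripts" "/var/www/shared/scripts"
<Directory "/var/www/shared/scripts">
    Require all granted
</Directory>

Repeat the mapping in each site (and in your local IIS/Apache), and the same /scripts/my.js URL resolves everywhere.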
I'll venture an answer.
We have shared assets over HTTP to sister sites and some unrelated sites. But the primary site went down, and it turned out to be a single point of failure that cascaded in insane ways across our other sites (support calls were also insane until we got the primary site fixed).
Nightmare. Never again...
What I suggest is to create an Amazon AWS S3 bucket and host your assets there.
Provided you don't exceed 20,000 requests or 15GB of data per month it is free.
There are lots of setup examples on the internet.
AND if I can set this up and get it working like a charm, then you should have no problem either.
Just a consideration...Good luck.
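A rough sketch of that with the AWS CLI; the bucket name and folder are placeholders, and whether the --acl flag is allowed depends on the bucket's public-access settings (many setups put CloudFront in front instead):

aws s3 mb s3://my-shared-web-assets
aws s3 sync ./shared-assets s3://my-shared-web-assets --acl public-read

Each site can then reference the same files with fully qualified URLs such as https://my-shared-web-assets.s3.amazonaws.com/css/reset.css.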
We do this without sites. Instead we use server mappings. In the CFAdmin tool we have mapped the logical path "/" to a directory path. That directory path has folders for js, images, css, and templates. Then if we want to use a JavaScript file, it's simply:
<script src="/js/theFileIWant.js"></script>

ClickOnce Error "different computed hash than specified in manifest" when transferring published files

I am in an interesting situation where I maintain the code for a program that is used and distributed primarily by our sister company. We are ready to distribute the program to all of the third-party users, and since it is technically our sister company's program, we want to host it on their website. (In the interest of anonymity, I'll use 'program' everywhere instead of the actual application name, and 'www.SisterCompany.com' instead of their actual URL.)
So I get everything ready to go: I set up the Publish settings to check for updates at program start, set the minimum required version, and set the Installation Folder URL and Update Location to "http://www.SisterCompany.com/apps/program/", with the actual Publishing Folder Location as "C:\LocalProjects\Program\Publish\". Everything else is pretty standard.
After publishing, I confirm that everything installs and works correctly when running directly from the publish location on my C: drive. So I put everything on our FTP server, and the guy at our sister company pulls it down and places everything in the '/apps/program/' directory on their web server.
This is where it goes bad. When I try to install it from their site, I get the error "File, Program.exe.config, has a different computed hash than specified in manifest." I tested it a bit, and I even get that error when trying to install from any network location on our network other than my local C: drive.
After doing the initial publish in Visual Studio, I have changed no files (which is the answer/reason I've found by doing some searching about this error).
What could be causing this? Is it because I set the Installation Folder URL to a location that it isn't initially published to?
Let me know if any additional info is needed.
Thanks.
After bashing my head against this all weekend, I have finally found the answer. After unsigning the project and removing the hash on the offending file (an XML file), I got the program to install, but it was giving me 'Windows Side by Side' errors. I drilled down into the app cache where the file was, and instead of the .config XML file, it was one of the HTML files from the website the ClickOnce installer was hosted on. It turns out the web server didn't seem to like serving up an .xml (or, as it turns out, .mdb) file.
This MSDN article ended up giving me the final solution:
I had to make sure that the 'Use ".deploy" file extension' option was selected so that the web server wouldn't mangle files with extensions it didn't like.
I couldn't figure out why that one file's hash would be different. Turns out it wasn't even the same file at all.
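For reference, the server-side half of that fix on IIS is usually a set of MIME mappings in web.config. A sketch based on the MIME types Microsoft documents for ClickOnce deployments (remove any mapping the server already defines, or IIS will complain about duplicates):

<system.webServer>
  <staticContent>
    <mimeMap fileExtension=".application" mimeType="application/x-ms-application" />
    <mimeMap fileExtension=".manifest" mimeType="application/x-ms-manifest" />
    <mimeMap fileExtension=".deploy" mimeType="application/octet-stream" />
  </staticContent>
</system.webServer>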
Is it possible that one of the FTP transfers is happening in text mode rather than binary?
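If you want to rule that out, most command-line FTP clients let you force binary mode before transferring; a quick sketch, with the file name borrowed from the question:

ftp> binary
ftp> put Program.exe.config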
For me the problem was that the .config transformations were done after generating the manifest.
To anyone else who's still having trouble, five years later:
The first problem was configuring the MIME type, which on nginx (/etc/nginx/mime.types) should look like this:
application/x-ms-manifest application
See Click Once Server and Client Configuration.
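For reference, the fuller set of mappings usually recommended for ClickOnce, following the MIME types Microsoft documents, would look something like this inside nginx's types block (the exact type used for .application varies between guides):

types {
    application/x-ms-application    application;
    application/x-ms-manifest       manifest;
    application/octet-stream        deploy;
}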
The weirder problem for me was that I was using git to handle the push to the server, i.e.
git remote add live ssh://user@mybox/path/to/publish
git commit -am "committing..."; git push live master
Works great for most things, but it was probably being registered as a "change," which prevented the app from installing locally. Once I started using scp instead:
scp -r * user@mybox:/path/to/dir/
It worked without a hitch.
It is unfortunate that there is not a lot of helpful information out there about this.
