Include source code from a different directory - Windows

I have three different domains, all on the same server, and I want to run the code on all three domains from one source on that same server, but I'm not sure of the best way.
Here's what I have:
domain01.com
domain02.com
domain03.com
domain04.com/sourcecode
I want domain01-03 to run the code inside domain04.com/sourcecode so users can go to their own domain and not have to go to domain04.com to see their site. I want to keep all the code inside domain04.com because I don't want to have to copy the code into each domain every time I make a change.
For whatever reason I can't get my head around the best way to do this -- and want to do it right.
Any advice?
Thanks!

All you need to do is create a mapping on the first three sites to the appropriate directory in the fourth site, e.g. map /domain04 to /full/path/to/domain04/sourcecode, then reference its CFML resources via /domain04 in CFC and include paths. The implication here is that the code does need to be accessible via the file system for all sites concerned.
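In practice that mapping can be declared per application in each site's Application.cfc. A minimal sketch, assuming a script-style Application.cfc and a made-up physical path:

    // Application.cfc for domain01.com (repeat for domain02/03)
    component {
        this.name = "domain01";
        // physical path is a placeholder -- point it at the shared code on disk
        this.mappings["/domain04"] = "D:\sites\domain04.com\sourcecode";
    }

With that in place, any of the three sites can do <cfinclude template="/domain04/somepage.cfm"> or createObject("component", "domain04.SomeService").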
Note that if you also want to serve non-CFML files via HTTP (e.g. images, CSS, JS), then you will also need a web server virtual directory along the same lines.
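On Windows/IIS that virtual directory can be added through IIS Manager or scripted with appcmd; a sketch with placeholder site name and path:

    %windir%\system32\inetsrv\appcmd add vdir /app.name:"domain01.com/" /path:/domain04 /physicalPath:"D:\sites\domain04.com\sourcecode"

Repeat for domain02.com and domain03.com so /domain04/... assets resolve on every site.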
None of this requires a framework; it's standard CF / web server functionality.

Are you using a framework? One like ColdBox could make this trivial if your code is written modularly. (Disclaimer: I am affiliated with ColdBox.)
If not, it really depends on what the code is. CFCs can be mapped anywhere via ColdFusion mappings. Even .cfm files can be included as long as the file systems are visible. If you want essentially a complete copy of a site in another web root without duplication, I would first consider a shared source control repo and a build process that checks it out in the appropriate places; secondly, a good old symlink will also work.
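Since this is Windows, the symlink option would be an NTFS symbolic link (or junction) created from an elevated prompt; the paths below are placeholders:

    rem mklink /D <link> <target> -- the link appears inside each site's web root
    mklink /D C:\inetpub\domain01.com\sourcecode D:\sites\domain04.com\sourcecode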

Related

Is it possible in Alfresco to avoid the randomly generated resource URLs?

I understand that in production it is good practice to use randomly generated filenames to force the browser to download the file rather than use its cache (cache busting).
For example, Alfresco generates URLs like this:
http://localhost:8080/share/res/js/surf/3f1b42d3f51529dabe2af76fecf9ebcd.js
And if I make a change in my libraries, the URL changes.
My question: is it possible to avoid this behavior in development?
I would like a path like:
http://localhost:8080/share/res/js/surf/surf.js
The advantage of that is that I can set breakpoints, make changes in the code and then reload the page.
Right now, each time I reload the page I have to find the part of the code I am working on all over again, set the breakpoint, and reload once more.

Populate file list with previously uploaded files

Using the jQuery-wrapped version of Fine Uploader v3.3.
Is it possible to populate the file list with files already in the upload folder?
I think "_addToList(id, name)" should do the trick, but I can't get it to work. Any ideas?
Seems that they are currently working on this feature:
https://github.com/Widen/fine-uploader/issues/784
So, this will be available soon.
This is not a behavior that Fine Uploader currently supports. Fine Uploader only displays files that users have submitted to the uploader since the current uploader instance was created. It doesn't try to be an all-in-one web application. You could probably add your own item to the list/UI via javascript. That probably wouldn't be terribly difficult, but seems like an odd thing to do.
If you'd like to discuss your specific use case more, please open up a feature request in the Github issue tracker.
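If you do go the route of injecting your own items into the UI as suggested above, a rough jQuery sketch might look like the following. It assumes the default Fine Uploader template and its qq-* class names (check your own template), and that your own server-side endpoint supplies the file names; none of this is part of the Fine Uploader API:

    // file names previously uploaded, e.g. fetched from your own endpoint
    var existingFiles = ['report.pdf', 'photo.jpg'];
    var $list = $('#fine-uploader').find('.qq-upload-list'); // default template's list

    $.each(existingFiles, function (i, name) {
        $('<li class="qq-upload-success"><span class="qq-upload-file"></span></li>')
            .find('.qq-upload-file').text(name).end()
            .appendTo($list);
    });

Keep in mind these items are purely cosmetic; the uploader instance doesn't know about them, so retry/cancel/delete won't work on them.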
Generally, client-side code cannot supply stored or hard-coded, path-based file names for use in any type of POST or upload operation. This is obviously a security measure: imagine if a malicious web page could attach some baked-in file name to a generic POST operation. So, from what I understand, only the user can specify path-based file names, via a file browser, for the session in which they are used. This applies to HTML/JavaScript/jQuery; I'm unsure whether Flash- or Silverlight-based solutions are limited in the same way. I think a Java-based uploader would be free of this restriction, but then you are moving closer and closer to installed software.

Getting images from a URL to a directory

Straight to the point: I want to supply a URL and get all the images under it, for example
www.blablabla.com/images
I want to grab every image inside this images folder. I already know how to get an image from a specific URL, but I don't know how to get all of them without going straight to each exact path. Is there a way to get a list of all the items under a URL path, or something like that?
Well, basically, this can't be done. Well, not under normal circumstances anyway. The problem is that you don't know what files are in that directory.
...unless the server has "directory listing" on. This is considered a security vulnerability, so the chance this is the case isn't too high. (The idea is that you are exposing details about your server that you don't have to; while that is no problem on its own, it can make things that could be a security problem known to the world.)
This means that if the server is yours, you can turn directory listing on, or that when the server happens to have it turned on, you can visit the url (www.blablabla.com/images) and see a listing of all the files in that directory. This doesn't always look exactly the same, but in general the common thing is that you will get an html page with links to all the files in the directory. As such, all you would need to do is retrieve the page and parse the links, ending up with the urls to the images you want.
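Assuming directory listing is enabled and you're free to pick the tool, here's a small Python sketch of that retrieve-and-parse idea (URL and extensions are placeholders, and a real HTML parser would be more robust than a regex):

    import re
    import urllib.request
    from urllib.parse import urljoin

    base = "http://www.blablabla.com/images/"  # placeholder
    html = urllib.request.urlopen(base).read().decode("utf-8", errors="ignore")

    # pull href values out of the listing page and keep only image-looking links
    hrefs = re.findall(r'href="([^"]+)"', html)
    image_urls = [urljoin(base, h) for h in hrefs
                  if h.lower().endswith((".jpg", ".jpeg", ".png", ".gif"))]

    for url in image_urls:
        urllib.request.urlretrieve(url, url.rsplit("/", 1)[-1])  # save next to the script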
If the server is yours, I would recommend at least looking into any other options you might have. One such option could be to make a script that provides all the urls instead of relying on directory listing. This does not have some of the more unfortunate implications that directory listing has (like showing non-images that happen to be in the same directory) and can be more flexible.
Another way to do this might be to use a protocol different from HTTP like FTP, SFTP or SCP. These protocols do not have the same flexibility as a script, but they do have even more safety as they easily allow you to restrict access to both the directory listing and your images to only people with correct login details (or private keys). (Of course, if such a protocol is available for your use and it's not your own server, you could use them as well.)

How to manage URLs in CodeIgniter so they can be updated in a single place

I believe Smarty templates have built-in functionality that allows you to manage your site URLs from a config file, so if something gets moved you only have to update the URL in one place. Does this sort of functionality exist in CodeIgniter? If not, any pointers or examples on how/where to add it?
For example:
Instead of hard-coding the link, the href would come from a shared variable, e.g. a "Settings" link built from something like $links['settings'].
But where would you want to set $links so that it was available everywhere? Or is it really best to just hard code them?
Take a look at the config class. It allows you to make custom config files.
It's not made specifically for URLs, but you can certainly use it for them.
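A minimal sketch of that approach; the file name, config keys and routes below are made up for illustration:

    <?php // application/config/links.php -- a custom config file
    $config['settings_url'] = 'account/settings';
    $config['profile_url']  = 'account/profile';

    // in a controller or view (url helper loaded via autoload or $this->load->helper('url')):
    $this->config->load('links');
    echo anchor($this->config->item('settings_url'), 'Settings');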
The base URL is basically right at the start of /app/config/config.php, where app is the name of your CodeIgniter application folder. You access it through calls to the base_url() function.
Yes, it's called Routes; the configuration is located at config/routes.php. Documentation
If you're asking about the rendered HTML of the links, then your best bet would be using site_url() in conjunction with constants, for example site_url(URL_SETTINGS);. There is no built-in functionality for that, but I don't think one is really necessary: it would be used too rarely, yet it would affect performance on every single page load.
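For completeness, the constants variant from that answer would look roughly like this (the constant name and route are illustrative):

    // application/config/constants.php
    define('URL_SETTINGS', 'account/settings');

    // wherever the link is rendered (url helper loaded):
    echo site_url(URL_SETTINGS);            // absolute URL to the route
    echo anchor(URL_SETTINGS, 'Settings');  // full <a> tag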

How can I set up custom ImageResizer urls?

I'm just getting started with ImageResizer and I'm stuck on what seem like totally basic questions:
I have an uploader that I use to put images into a directory that's not directly accessible over HTTP. (If I just put an image at, say, /images/myimage.jpg, then anyone could access it just by asking for it, whereas I want to limit access via thumbnails, watermarks, etc.) So I want to put it at /offlimits/myimage.jpg, but be able to serve it up at /public/images/myimage.jpg.
I don't really want to dump all the images in the same offlimits folder, because putting lots of files in one folder makes Windows unhappy. But I don't want to expose the details of that subdirectory structure either, so where do I put the mapping between the public facing url and the actual image location?
Most generally, I don't necessarily want an image extension at all, so I'd like to say /public/image_id?width=100... and have this map to /offlimits/sub1/sub2/sub3/image_id.jpg.
Can anyone advise about how to set this up?
Three-part questions are generally frowned upon here at SO, but I'll bite anyhow :)
If you're allowing access to images based on authentication, then you need to use ASP.NET's URL Authorization feature. ImageResizer supports URL Authorization rules. If you just don't want the source files available, and want to force them resized or watermarked, read the docs on how to implement arbitrary rules like this.
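For the first point, standard web.config URL Authorization rules cover it; for example, to deny anonymous users access to the folder named in the question:

    <location path="offlimits">
      <system.web>
        <authorization>
          <deny users="?" />
        </authorization>
      </system.web>
    </location>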
You can rewrite image paths to your heart's content with Config.Current.Rewrite, which works just like the PostRewrite event mentioned earlier. Just remember you'll have to keep it all straight in your head later.
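As far as I recall the v3 API, that event is wired up in Application_Start; a sketch, with the /public/ prefix and /offlimits/ layout taken from the question and the id-to-subfolder lookup left as a placeholder (verify the names against the ImageResizer docs):

    // Global.asax.cs -- sketch only; signature recalled from the v3 docs
    protected void Application_Start(object sender, EventArgs e)
    {
        ImageResizer.Configuration.Config.Current.Pipeline.Rewrite +=
            delegate(System.Web.IHttpModule m, System.Web.HttpContext ctx,
                     ImageResizer.Configuration.IUrlEventArgs args)
        {
            // map the public-facing URL onto the real, non-browsable file
            if (args.VirtualPath.StartsWith("/public/", StringComparison.OrdinalIgnoreCase))
            {
                string id = args.VirtualPath.Substring("/public/".Length);
                // whatever sub-folder scheme you choose (sub1/sub2/...) goes here
                args.VirtualPath = "/offlimits/" + id + ".jpg";
            }
        };
    }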
Image extensions are good things. Don't fight them. They let the server figure out the right mime-type to send and help errant browsers recover from related bugs. They prevent issues on several platforms and make the Save As dialog work. They also significantly improve server efficiency, since the handling logic doesn't have to wait as long. This is particularly relevant because of the design of the IIS/ASP.NET modules system.
