I have a third-party WP plugin which, although no longer updated, still works fine - and for which I've not been able to find an alternative.
It's 'Zajax' - which ajax-loads internal pages... thus enabling a streaming-radio audio-player to be fixed to the viewport-base, with continuous play throughout page-changes.
However, it appears to require absolute urls - on root-relative urls it reloads the whole page (and thus stops continuous-play).
This is a hindrance, because I normally use root-relative urls - and hence sometimes forget to ensure that all internal urls are absolute rather than root-relative.
I want to modify it so that it'll work with root-relative urls - but I don't know enough to do this.
Actually, using root-relative URLs is not a good idea, but if that's more comfortable for you, then this small jQuery snippet may help you with the problem.
jQuery("a").each(function(){
if (jQuery(this).attr("href").indexOf("http")==-1){
jQuery(this).attr("href","https://yourwebsiteurl.com/"+jQuery(this).attr("href"));
}
});
You can put this code in the footer area of your website; it will detect root-relative links and convert them to absolute links (without changing anything in your backend, of course).
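One caveat (this is an assumption on my part about how Zajax behaves): since the plugin swaps page content in via AJAX, links inserted after the initial page load won't have been touched by the snippet above. A rough way to cover those as well is to re-run the rewrite whenever new nodes are added, for example with a MutationObserver (again, yourwebsiteurl.com is a placeholder for your own domain):

// Re-run the href rewrite whenever new content is added to the page
// (e.g. after an AJAX page change).
function absolutizeLinks() {
    jQuery("a[href^='/']").not("[href^='//']").each(function () {
        jQuery(this).attr("href", "https://yourwebsiteurl.com" + jQuery(this).attr("href"));
    });
}
new MutationObserver(absolutizeLinks).observe(document.body, { childList: true, subtree: true });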
I am trying to decode and encode Joomla URLs, but Joomla doesn't seem to have a consistent API for that (or so it appears). The main problem comes in when another SEO plugin is installed and the operation is performed as a background process (i.e. not whilst rendering in a browser through Joomla).
The other big problem is that users copy and paste SEO URLs of their own site directly into the content.
Does anyone know a solution for this? Supporting all sorts of SEO plugins individually is a total no-go and rather impossible.
I actually thought it's the job of the CMS to guarantee, at an API level, that SEO URLs can be decoded and encoded without knowing the plugins, but no. I also had a look at some plugins and indeed, plugins do handle code for other plugins, even though they shouldn't.
Well, thanks.
You can't. JRoute won't work reliably in the administrator, I even tried hacking it, it's a no-go.
Moreover, sh404 (one of the leading SEF extensions) does a curl call to the frontend in order to get the paths right. You can find in their code a commented-out attempt to route in the backend.
Are you trying to parse content when it's saved, find SEF urls and replace them with their non-SEF equivalents? If you create a simple component to handle this in the frontend (just get what you need from xmap), then you can query the frontend from the backend with curl/wget and possibly achieve this with a decent rate of success; but I wouldn't expect this to work 100% (sometimes parameters are added by components, or the order of parameters is different from call to call, and the router.php in extensions can be very fragile or even plain wrong).
I'd like to speed up my site's loading time, in part by ensuring all CSS/JS is being cached by the browser, as recommended by Google's PageSpeed tool. But I'd like to ensure that visitors have the latest CSS/JS files if they are updated and the cache now contains old code.
From my research so far, appending something like "?459454" to the end of the CSS/JS url is popular. But wouldn't that force the visitor's browser to re-download the CSS/JS file every time?
Is there a way to set the files to be cached by the browser, but ensure the browser knows about updated versions of the cached files?
If you're using Apache, you can use mod_pagespeed (mentioned earlier by symcbean) to do this automatically.
It would work best if you also use the ModPagespeedLoadFromFile directive, since that will create a new URL as soon as it detects that the resource has changed on disk; however, it will work fine without that (it will use the cache expiry time returned when it fetches the resource to rewrite it).
If you're using nginx, you could use ngx_pagespeed.
If you're using IIS, you could use IISpeed, which is not a Google product and whose full feature set I don't know.
Version numbers will work, but you can also append a hash of the file to the filename with your web framework or asset build script:
<script src="script-5054a101c8b164cbfa570d97fe23cc0d.js"></script>
That way, once your HTML changes to reflect this new version, browsers will just download and cache the updated version of your script.
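As a minimal sketch of that idea in Node.js (the file name script.js is just a placeholder; your framework or build tool may already do this for you):

// Fingerprint a file by the MD5 hash of its contents and copy it
// to a name like script-5054a101c8b164cbfa570d97fe23cc0d.js.
const fs = require("fs");
const crypto = require("crypto");

const source = "script.js"; // placeholder path
const hash = crypto.createHash("md5").update(fs.readFileSync(source)).digest("hex");
const fingerprinted = "script-" + hash + ".js";
fs.copyFileSync(source, fingerprinted);

// Reference the fingerprinted name from your HTML templates:
console.log('<script src="' + fingerprinted + '"></script>');

Because the name only changes when the contents change, you can serve these files with a far-future cache expiry and browsers will still pick up new versions as soon as the HTML points at the new name.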
As you say, append a query string to the URL of the asset, but only change it if the content is different, or change it when you deploy a new version.
appending something like "?459454" to the end of the CSS/JS url is popular. But wouldn't that force the visitor's browser to re-download the CSS/JS file every time?
No, it won't force them to download each time; however, there are a lot of intermediate proxies out there which ignore query strings on cacheable content - hence many tools (including mod_pagespeed, which does automatic URL rewriting based on file contents, and content merging on the fly, along with lots of other cool tricks) move the version information into the path / filename.
If you've only got .htaccess type access then you can strip the version information out to map direct to a file, or use a scripted 404 redirector (but this is probably only a good idea if you're behind a caching reverse proxy).
I'm just getting started with ImageResizer and I'm stuck on what seem like totally basic questions:
I have an uploader that I use to put images into a directory that's not directly accessible over HTTP. (If I just put an image at, say, /images/myimage.jpg, then anyone could access it by just asking for it, whereas I want to limit access via thumbnails, watermarks, etc.) So I want to put it at /offlimits/myimage.jpg, but be able to serve it up at /public/images/myimage.jpg.
I don't really want to dump all the images in the same offlimits folder, because putting lots of files in one folder makes Windows unhappy. But I don't want to expose the details of that subdirectory structure either, so where do I put the mapping between the public facing url and the actual image location?
Most generally, I don't necessarily want an image extension at all, so I'd like to say /public/image_id?width=100... and have this map to /offlimits/sub1/sub2/sub3/image_id.jpg.
Can anyone advise about how to set this up?
Three part questions are generally frowned upon here at SO, but I'll bite anyhow :)
If you're allowing access to images based on authentication, then you need to use ASP.NET's URL Authorization feature. ImageResizer supports URL Authorization rules. If you just don't want the source files available, and want to force them resized or watermarked, read the docs on how to implement arbitrary rules like this.
You can rewrite image paths to your heart's content with Config.Current.Rewrite, which works just like the PostRewrite event mentioned earlier. Just remember you'll have to keep it all straight in your head later.
Image extensions are good things. Don't fight them. They let the server figure out the right mime-type to send and help errant browsers recover from related bugs. They prevent issues on several platforms and make the Save As dialog work. They significantly improve server efficiency as well, since handling logic doesn't have to wait as long. This is particularly relevant because of the design of the IIS/ASP.NET modules system.
I am starting to set up a personal website, and I would like its layout to look something like
-------------------------------
- Page Header & Menus Go Here -
-------------------------------
- Main Contents -
-------------------------------
- Footers -
-------------------------------
The main question is that I would like it to be a single-page interface in which the main contents are loaded and displayed with a combination of AJAX and jQuery to produce a nice effect. However, I would, of course, like to have the contents bookmark-enabled and indexed by search engines. I have skimmed through the Single Page Interface Manifesto, which explains some nice ways of achieving this, but I wouldn't really like to have my URLs look like
http://www.mysite.com/index.php#!section=section1
http://www.mysite.com/index.php#!section=section2
I would, of course, like to re-write them as
http://www.mysite.com/section1
http://www.mysite.com/section2
My questions are whether this approach is correct/doable and whether AJAX URLs are compatible with URL rewriting. What URLs would be indexed by, say, Google anyway?
If you want your page to work without reloading and to update the page's URL at the same time, the only way to achieve this is by changing the hash in the URL (location.hash = 'whatever').
URL rewriting cannot be used since the hash is not sent to the server, it's only available in the browser's scope.
Check Facebook or Twitter URLs. They are prettier than #!section=section1 but still need the hash.
Cheers.
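To make the hash approach concrete, here's a bare-bones sketch (the container id, the /sections/ path and the data-section attribute are placeholders, not anything your site already has):

// Load a section whenever the hash changes, e.g. /#!section=section1
jQuery(window).on("hashchange", function () {
    var match = location.hash.match(/#!section=(\w+)/);
    if (!match) return;
    // Fetch that section's markup and inject it into the main content area
    jQuery("#main-content").load("/sections/" + match[1] + ".html");
});

// Navigation links only need to change the hash; the handler does the rest
jQuery(".menu a").on("click", function (e) {
    e.preventDefault();
    location.hash = "#!section=" + jQuery(this).data("section");
});

This produces exactly the #! style URLs mentioned in the question; the pjax/pushState approaches below avoid them.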
If you want to load different content/tabs/parts of a page without reloading the browser, it is now possible with pjax.
You can use something like http://padrino-pjax.heroku.com/
You can try it: go to the link and click on any of the links (home, dinosaurs, aliens) and you will see that it changes the URL and some of the content without reloading the full page.
It is achieved using AJAX plus push/pop of the URL in the browser's history.
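The underlying mechanism looks roughly like this (a simplified sketch, not the actual pjax code; the #container id and the a[data-pjax] selector are assumptions):

// Minimal pjax-style navigation: fetch the new page, swap one
// container's contents, and keep the address bar in sync.
function loadPage(url) {
    jQuery.get(url, function (html) {
        var fragment = jQuery("<div>").append(jQuery.parseHTML(html)).find("#container").html();
        jQuery("#container").html(fragment);
    });
}

jQuery(document).on("click", "a[data-pjax]", function (e) {
    e.preventDefault();
    history.pushState(null, "", this.href);
    loadPage(this.href);
});

// The back/forward buttons fire popstate; reload whatever URL is now shown
window.addEventListener("popstate", function () {
    loadPage(location.href);
});

Because pushState changes the real path rather than the hash, the URLs stay in the clean /section1 form and the server can also render those pages directly for search engines.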
I'm looking for a solution myself for a similar problem (I have a client site with an AJAXed WordPress theme, and this dreadful #! stuff in the URL prevents all the social sharing plugins I have tried so far from working correctly).
Apparently, there is a solution (with some drawbacks, of course). I found out about it here: http://moz.com/blog/create-crawlable-link-friendly-ajax-websites-using-pushstate
I know it's been like two years since you asked, but it could be helpful for someone else, or you may wanna check it out just for the sake of curiosity itself! :-)
Not long ago I came across this website: http://www.danasoft.com/
This websites provides dynamically updating signatures which are pretty cool in my opinion.
There is just one thing that I don't get and would really like to know how to do.
Here's a direct link to an image on the website: http://www.danasoft.com/vipersig.jpg Try refreshing. Notice it changes? How do I achieve that? How do I have a direct link to a file like www.mypage.com/thing.jpeg output different images each time?
Basically, the URL is not actually retrieving the file directly each time, but rather the server is intercepting that URL and serving a (possibly random) image from a larger set of images. Depending on whether the server is running Apache, IIS, etc, the implementation could vary... This could also probably be achieved with the MVC routing engine by defining a custom route handler for URLs ending in '.jpg', but I'm not actually sure.
EDIT:
See this discussion for more detail on the MVC implementation.
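To illustrate the server-side idea, here's a hedged sketch in Node/Express (not what danasoft actually runs; the signatures folder and port are placeholders): the fixed URL is handled by code that picks a different image from a directory on every request.

const express = require("express");
const fs = require("fs");
const path = require("path");

const app = express();
const imageDir = path.join(__dirname, "signatures"); // placeholder folder of .jpg files

// A fixed URL (/thing.jpeg) whose handler serves a different image each time
app.get("/thing.jpeg", function (req, res) {
    const files = fs.readdirSync(imageDir).filter(f => f.endsWith(".jpg"));
    const pick = files[Math.floor(Math.random() * files.length)];
    res.type("jpeg");
    res.sendFile(path.join(imageDir, pick));
});

app.listen(3000);

The same pattern applies with an Apache rewrite rule pointing .jpg requests at a script, or an MVC route handler, as described above.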