I'm using MediaWiki v1.19.1.
My wiki works well when I use it locally,
but when I access it over the network (from another computer or a different IP),
it displays only the text; there are no images.
It looks like the Classic skin, but it isn't.
The real problem is that my wiki has no layout at all (other public wiki pages display fine).
My wiki uses the MonoBook skin, yet I can only see the text on the page.
I have changed the permissions to 777 on all directories (/var/www/kj/*),
but there are still no images.
Help me, please...
I got the same issue some time ago and the following worked fine for me.
The issue might be related to the LocalSettings.php file and the general setting $wgServer.
The following link provides more details: Manual:$wgServer
Since 1.18 MediaWiki has also supported setting $wgServer to a protocol-relative URL.
e.g. //www.mediawiki.org
This is used for supporting both HTTP and HTTPS with the same caches by using links that work under both protocols.
So try removing localhost and providing your own URL instead, e.g. $wgServer = "//mywebsite.com";
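For example, in LocalSettings.php (a minimal sketch; the domain and the /kj script path are assumptions based on your install directory, so adjust them to your setup):

$wgServer     = "//mywebsite.com";  # protocol-relative: works over both HTTP and HTTPS
$wgScriptPath = "/kj";              # path to the wiki under the web root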
There's not enough information to give a definite answer; however, general recommendations for such situations are:
If you're using any Apache rewrite rules (for example, to make URLs prettier), try disabling them.
Especially if you're using the http://example.com/Page_title style URLs, you should know that they're unsupported by the developers and require serious MediaWiki/Apache skills (and even then they will likely introduce subtle bugs).
Install Firebug and check what HTTP error you get for your images: is access denied (HTTP 403), or does the web server not see them at all (HTTP 404)? That should give you an idea of what's going on.
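If you prefer the command line, requesting one of the skin files directly shows the status code as well (a sketch; the host and path are assumptions for a default MonoBook install under /kj):

curl -I http://your-server-ip/kj/skins/monobook/main.css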
I've got a web app which heavily uses AngularJS / AJAX and I'd like it to be crawlable by Google and other search engines. My understanding is that I need to do something special to make it work, as described here: https://developers.google.com/webmasters/ajax-crawling
Unfortunately, that looks quite nasty and I'd rather not introduce the hash tags. What I'd like to do is to serve a static page to Googlebot (based on the User-Agent), either directly or by sending it a 302 redirect. That way, the web app can be the same, and the whole Googlebot workaround is nicely isolated until it is no longer necessary.
My worry is that Google may mistakenly assume that I'm trying to trick Googlebot, while my goal is to help it. What do you guys think about this approach, and what would you recommend?
Recently I came upon this excellent post from yearofmoo, explaining in detail how to make your Angular app SEO friendly. In essence, when bots see a URI with a hashbang ('#!') they know it's an AJAXed page and will try to reach the same URI by replacing the '#!' with '?_escaped_fragment_='. This alternative URI instructs bots that they should expect to find a definitive static version of the page they were accessing.
Of course, to achieve this you'd have to introduce hashbangs into your URIs. I don't see why you're trying to avoid them; doesn't Gmail use them?
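To make that concrete: on the server side it boils down to detecting the _escaped_fragment_ parameter and returning a static snapshot. A minimal Apache sketch (the /snapshots directory and the example URL are assumptions, not part of Google's scheme):

RewriteEngine On
# A crawler that sees http://example.com/#!/products/42 will instead request
# http://example.com/?_escaped_fragment_=/products/42
RewriteCond %{QUERY_STRING} ^_escaped_fragment_=/?(.*)$
# Serve the matching pre-rendered snapshot instead of the JavaScript app
RewriteRule ^ /snapshots/%1.html? [L]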
Yeah, unfortunately, if you want to be indexed you have to adhere to the scheme :( If you're running a Ruby app, there's a gem that implements the crawling scheme for any Rack app:
gem install google_ajax_crawler
A write-up of how to use it is at http://thecodeabode.blogspot.com.au/2013/03/backbonejs-and-seo-google-ajax-crawling.html, and the source code is at https://github.com/benkitzelman/google-ajax-crawler
Have a look at these links; they will give you a good direction:
Set up your own Prerender service using Prerender.io's open-source code (see the sketch after this list):
https://prerender.io/
Use a different existing service such as BromBone, Seo.js or SEO4AJAX:
http://www.brombone.com/
http://getseojs.com/
http://www.seo4ajax.com/
Create your own service for rendering and serving snapshots to search engines. Read this article. It will give you the big picture:
http://scotch.io/tutorials/javascript/angularjs-seo-with-prerender-io
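If you go the Prerender route with a Node/Express app, the middleware drops in ahead of your routes (a sketch; the token is a placeholder and is only needed for the hosted service):

var express = require('express');
var app = express();

// Detect crawler requests (_escaped_fragment_ or known bot user agents) and
// return pre-rendered HTML from a Prerender service instead of the Angular app.
app.use(require('prerender-node').set('prerenderToken', 'YOUR_TOKEN'));

// ...your normal routes and static file handling go here...

app.listen(3000);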
As of May 2014, Googlebot executes JavaScript. Check Webmaster Tools to see how Google sees your site.
http://googlewebmastercentral.blogspot.no/2014/05/understanding-web-pages-better.html
Edit: Note that this does not mean other crawlers (Bing, Facebook, etc.) will execute JavaScript. You may still need to take additional steps to ensure that these crawlers can see your site.
I've been developing several Magento modules on a Mac's local Apache server. Lately, I've moved the modules to a new Magento install on a new server. My problem is that all but one of them 404 when I try to load their admin pages. I can't find any reason why this one module works while the others don't even try to load pages. Most of the code is very similar from one module to the next. Also, the 404 pages are not helpful, and there are no exceptions or log entries to help me. These modules all work on the old server, and although some of the code has bad/old links that need to be fixed and those generate errors, I see no reason why on the new server they aren't even trying to load pages and generate errors.
I think the configs work, because I get the admin menu. Since the layouts have nothing concerning the front part of the URLs, I see no reason for the problem to be there, either. I could post code, but I have no idea what to post that could be causing this.
I would greatly appreciate any insight into what could be causing this.
When I develop modules on Windows and send them to Linux, I sometimes forget and write some things with capital letters and others in lowercase. Windows isn't case sensitive in this respect, but Linux is. Maybe the same applies between Macs and Linux.
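For example (made-up module and class names, just to illustrate the kind of mismatch that works on a case-insensitive filesystem but 404s on Linux):

// app/code/local/MyCompany/MyModule/controllers/Adminhtml/ReportController.php
class MyCompany_MyModule_Adminhtml_ReportController extends Mage_Adminhtml_Controller_Action
{
    public function indexAction()
    {
        // Render the admin page for this module
        $this->loadLayout()->renderLayout();
    }
}

// If config.xml (or an adminhtml menu/layout file) refers to the module as
// "MyCompany_Mymodule", Magento looks for app/code/local/MyCompany/Mymodule/...
// That path resolves on a case-insensitive Mac but not on Linux, hence the 404.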
I'm using FBML and everything was working before; then I noticed that when you click Like, the comment box no longer shows up. I checked the dev site and saw that the way it's done has changed.
I updated my code, but it still doesn't show the comment box, although the actual liking action works fine.
Here is a link: http://fez.nu/Oniir
EDIT 2: I'm using XFBML; I don't know if that makes a difference. I've heard that FBML is deprecated.
EDIT 3: I browse Facebook over HTTPS. I turned that off and the comment boxes show up on my site, so that problem is solved. But how do I make it work for users who use secure browsing on Facebook when my site is not served over HTTPS?
You can use an HTML inline frame.
<iframe src="https://mytab.example.com/tabs/"></iframe>
This question has produced an accepted solution to what appears to be your problem.
Abstract
You can avoid SSL warnings for domains that support SSL by not being specific about the transport protocol, e.g. instead of including http:// or https://, use //
<!-- Instead of this: -->
<iframe src="http://www.facebook.com/plugins/like.php?params"></iframe>
<!-- Do this: -->
<iframe src="//www.facebook.com/plugins/like.php?params"></iframe>
Security considerations
Please note that there are some security considerations to this approach. I recommend you read the below articles and questions.
Are there security issues with embedding an https iframe on an http page (SE question)
HTTP and HTTPS iframe (SO question)
Other considerations when serving mixed (http/https) content (external site)
Edit 1
In the FAQ for the Like button, it's stated that the button needs 400 pixels of width to give the user the option to add a comment. If you don't have 400 pixels available, I think you'd have to sacrifice the option to post a comment, or use a pop-up window instead.
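For example, with the XFBML version of the button (a sketch; width 450 is just a value comfortably above the 400-pixel minimum mentioned in the FAQ):

<!-- Needs roughly 400px or more of width for the comment box to appear -->
<fb:like href="http://fez.nu/Oniir" width="450"></fb:like>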
This problem started last night.
I thought it was my code at first, but a quick rollback to code I KNOW worked (it was QA tested) showed the same issue.
I went to the source itself, Facebook's own Like button generation tool, and THEY have the same problem.
Then I picked both a random CNN and a random Yahoo news article, and that is what you call a trifecta: all three sites could not properly render the comment UI elements for the Like button.
If it walks, quacks and moves its head like a duck... WTF? :))
I am trying to get IIS 7.5 compression working. It sounds so simple in all the blogs, but it isn't working for me. I am using ASP.NET MVC, if that matters. I am testing this locally, and I am using IIS 7.5, not Cassini. I have read and tried every article I can find. I have made sure the static and dynamic compression modules are installed. I have tried several articles like this that talk about setting it up using appcmd. I have tried several articles like this that configure the settings in the web.config. I have checked the MIME types, as discussed in articles where JS files were only working sometimes (I can't post the link because I am a new user and limited to 2 links). I have created action filters for MVC as discussed in many blogs (again, I can't post the links).
None of the methods compress anything: no CSS, no JS, no pages... nothing. I can even step through the code and see that the action filter is running and that it executes the following two lines, yet Fiddler shows no compression.
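// Advertise gzip to the client, then wrap the output stream so the response body is compressed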
response.AppendHeader("Content-encoding", "gzip");
response.Filter = new GZipStream(response.Filter, CompressionMode.Compress);
I must be having a huge brain fart and missing something really stupid, but I can't figure it out.
Does IIS compression just not run when running locally? Does the IIS compression not work with MVC because of the routing or something?
Any suggestions/tests/info?
Try adding the staticContent element to the web.config.
http://www.iis.net/ConfigReference/system.webServer/staticContent
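Something along these lines (a sketch; which MIME types you need to register depends on what your site serves):

<system.webServer>
  <!-- Enable both static and dynamic compression for this site -->
  <urlCompression doStaticCompression="true" doDynamicCompression="true" />
  <staticContent>
    <!-- Static compression only applies to extensions with a registered MIME type -->
    <remove fileExtension=".js" />
    <mimeMap fileExtension=".js" mimeType="application/javascript" />
  </staticContent>
</system.webServer>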
I'm making a website using Facebook Connect and decided to use Facebook's XFBML tags like "fb:profile-pic" since they are so easy to use.
I haven't been able to make them work no matter how hard I look online, but then I noticed that they work in every browser except Firefox.
I also realized that they don't work even in Facebook's own "The Run Around" sample app!! You can check it out here: http://www.somethingtoputhere.com/therunaround/index.php
If you log in with Firefox, your picture is not shown, but if you use another browser it is. This happens with the fb:profile-pic tag and any other tag like fb:name.
I haven't found any information online, so I'm asking people who have worked with this: Are these tags simply not compatible with Firefox? Do they have outages or something like that? Has this happened to anyone before? Any ideas on how to resolve this?
I guess they do have "outages". I've spent the whole weekend trying to resolve this and now they post they had a problem and have resolved it.
From the Platform Live Status website:
http://developers.facebook.com/live_status.php#msg_497
We are experiencing a possible config problem with api.connect.facebook.com. If you are including the Connect JS library through http://static.ak.connect.facebook.com/js/api_lib/v0.4/FeatureLoader.js.php, all API requests through JavaScript would fail. This affects rendering of XFBML tags (such as fb:name and fb:profile-pic) as well. While we are fixing this issue, you can work around the problem by changing http://static.ak.connect.facebook.com/js/api_lib/v0.4/FeatureLoader.js.php to http://static.ak.facebook.com/js/api_lib/v0.4/FeatureLoader.js.php. It's also safe to keep the URL change permanently because connect.facebook.com is just an alias to facebook.com.
I wish they had updated that sooner. Now I'm looking for a place to find out about this stuff so I don't spend days working on something before realizing it's not a problem with my code!
Open up Firefox > Preferences > Privacy and make sure "Accept third party cookies" is checked. This is needed for Facebook Connect to work. Also, when using Connect, make sure all your tags are fully closed, i.e. <fb:profile-pic></fb:profile-pic> and not <fb:profile-pic/>. From the docs:
The user's browser must be set to accept 3rd Party Cookies in order for it to stay connected between clicks.
Source: http://wiki.developers.facebook.com/index.php/Logging_In_And_Connecting
FWIW, I wouldn't use "the run around" as a sample app. That thing has been the same since they introduced Connect and is pretty hacky.
Also check the Connect section under the Canvas options in your app settings; there should be a link to your physical file there.