I have several sites running on Umbraco 7.5.3 on my local IIS; for one of them I get an empty section list when accessing the back-office.
There are no JavaScript errors in the console.
The issue is that the request
GET /umbraco/backoffice/UmbracoApi/Section/GetSections HTTP/1.1
returns an empty array, while the other sites return their sections as expected.
I can't find any documentation on this issue, or the relevant source code to debug it.
Could someone explain, or give me a hint, what is happening and how that array gets populated?
I've cleared the cache and cookies, and tried incognito mode, different browsers, and reinstalling Umbraco.
In your config folder, check that the /config/applications.config and /config/trees.config files have contents. I have occasionally seen these files get emptied for some reason, which causes the sections and trees to be empty.
If they're empty, delete them and touch the web.config file to restart the app; the files should get rebuilt.
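For reference, the GetSections list is built from applications.config, so a healthy file should look roughly like this (these are the Umbraco 7 defaults; your aliases and sort orders may differ):

<?xml version="1.0" encoding="utf-8"?>
<applications>
  <add alias="content" name="Content" icon="traycontent" sortOrder="0" />
  <add alias="media" name="Media" icon="traymedia" sortOrder="1" />
  <add alias="settings" name="Settings" icon="traysettings" sortOrder="2" />
  <add alias="developer" name="Developer" icon="traydeveloper" sortOrder="3" />
  <add alias="users" name="Users" icon="trayuser" sortOrder="4" />
  <add alias="member" name="Members" icon="traymember" sortOrder="5" />
</applications>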
So I have been having a war with Google these last couple of days.
I created a sitemap (which is generated dynamically) and routed to it using a reverse proxy in nginx.
Happy with myself, I submitted the URL to Google Search Console.
I got an error right away: "Sitemap is HTML".
After digging around for a while, it turned out our pre-renderer had picked up the request and served Google a pre-rendered version of the XML file, hence the HTML.
But even after fixing this, making sure no request for sitemap.xml goes to our pre-renderer, Google still gives the same error message.
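For reference, the nginx exclusion looks roughly like this (the upstream address is a placeholder for the app that generates the sitemap; the rest of the server block is what routes crawler traffic to the pre-renderer):

# Serve the sitemap straight from the app that generates it,
# so the request never reaches the pre-rendering proxy.
location = /sitemap.xml {
    proxy_pass http://127.0.0.1:3000;  # placeholder upstream for the dynamic sitemap
}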
I have tried removing it and re-adding it in Search Console multiple times on different days, I have tried waiting, I have tried serving it under another name (sitemap2.xml), and I have tried adding a static XML file instead of the dynamic one. Nothing works!
I have verified the XML file (after disabling the pre-renderer) with multiple online validators, and every one gives me an OK.
It's as though it's ignoring my requests to re-check the file.
Sitemap location: https://www.tirex.se/sitemap.xml
Any tips at this point would be much appreciated!
On my Windows machine I run a simple file server that serves certain files from a folder. I access these files via the Chrome/Firefox browsers.
For a certain file format (in my case ".bin" files) the XHR request always stalls with a message saying "pending". But if I rename the file extension to ".cbin" and reload the page in the browser, it works again.
Why are the browsers preventing a certain file from being loaded? All of this used to work a month ago without issues (i.e. loading the .bin files). I have disabled my antivirus too.
Any help would be invaluable. Thanks
After hours of searching the web and experimenting, I realized that browsers have recently added rules that block downloads of certain file types (exe, dmg, zip, gzip, bin, etc.) over an HTTP connection, for security reasons.
Hope this will help someone who faces the same issue.
You can read more about this issue here and here.
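If you need the files to load again without renaming them, serving the folder over HTTPS avoids the block. A minimal sketch using Node's built-in https module (the folder name and the cert/key file names are assumptions; a locally trusted pair can be generated with a tool like mkcert):

// serve.js: a bare-bones HTTPS static file server.
const https = require('https');
const fs = require('fs');
const path = require('path');

const root = path.resolve(__dirname, 'files'); // folder with the .bin files (assumed name)

https.createServer({
  key: fs.readFileSync('localhost-key.pem'),   // assumed cert/key file names
  cert: fs.readFileSync('localhost.pem'),
}, (req, res) => {
  // Resolve the request inside root and reject path traversal.
  const file = path.join(root, path.normalize(req.url.split('?')[0]));
  if (!file.startsWith(root)) { res.writeHead(403); return res.end(); }
  fs.readFile(file, (err, data) => {
    if (err) { res.writeHead(404); return res.end('Not found'); }
    res.writeHead(200, { 'Content-Type': 'application/octet-stream' });
    res.end(data);
  });
}).listen(8443, () => console.log('Listening on https://localhost:8443'));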
Writing an ASP.NET MVC app and playing with CKEditor (4.7.2) as a newbie. I load CKEditor from the CDN, so I have to configure it to load plugins locally, as per the doc:
CKEDITOR.plugins.addExternal('name', '@Url.Content("~/scripts/ckeditor/plugins/name")',
    'plugin.js');
But if I do this, the browser tries to load this:
localhost/scripts/ckeditor/plugins/name?t=H7HDplugin.js
Checking in the browser console, I also see that all CKEditor files are loaded that way, with this ?t=H7HD query string appended.
The only workaround I found was to use this form instead:
CKEDITOR.plugins.addExternal('name',
    '@Url.Content("~/scripts/ckeditor/plugins/name/plugin.js")', '');
which loads the file correctly:
localhost/scripts/ckeditor/plugins/name/plugin.js?t=H7HD
Is this a bug? Shouldn't the first form build the URL in the correct order? Or is there something I missed?
Update: I realized that the doc puts a final slash at the end of the path. If I add it, this also fixes the issue, keeping the query string at the end. But this does not explain why the query string exists, or why CKEditor does not check for this trailing slash.
And this brings me to a secondary question. During development, modifying files in Visual Studio is enough to make Chrome reload them without any action on my part. But in the case of plugins loaded by CKEditor, Chrome keeps the old version, and I have to clear the browser cache each time I want to be sure I have the latest one. Is there any way to fix this? I think it is a related question, because I first thought this ?t=... was a mechanism to reload the files by bypassing the cache, except that here the value behind 't' does not change.
The second parameter of the addExternal() method, as per the documentation, is:
path : String
The path of the folder containing the resource.
If the path has already been expanded to /scripts/ckeditor/plugins/name?t=H7HD, then the last parameter (the plugin file, plugin.js) is concatenated directly onto it, which produces the broken URL you saw.
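So with the folder form, the trailing slash is required; something like this (the plugin name is a placeholder):

// The second argument is the plugin *folder* and must end in a slash;
// the third argument is the file name inside that folder.
CKEDITOR.plugins.addExternal(
    'myplugin',                              // hypothetical plugin name
    '/scripts/ckeditor/plugins/myplugin/',   // folder path: trailing slash matters
    'plugin.js'                              // file inside that folder
);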
As per the documentation, this is the timestamp property:
https://ckeditor.com/docs/ckeditor4/latest/api/CKEDITOR.html#property-timestamp
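The timestamp is appended to every resource URL for cache busting, but it only changes between CKEditor releases, which is why the value behind 't' never varies and Chrome keeps serving your stale plugin during development. One workaround (a sketch, not an official recipe) is to override the property with a per-load value before the editor loads any resources:

// Run after ckeditor.js is loaded but before creating any editor instance,
// so each page load gets a fresh ?t=... and plugin files are refetched.
CKEDITOR.timestamp = String(Date.now());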
I'm developing a game in JS/PHP. When I first uploaded my project, it contained a file named "index.html" with nonsensical content (only the word "bla" and a Facebook like button). I later deleted that "index.html" so that requests to the domain would hit my "index.php" instead (which contains the actual game).
This happened over a week ago, and I still see people (friends I asked to test the game) getting this dumb "index.html" shown when they open the site in their browsers. I also see this happening for roughly a third of the browsers when requesting screenshots via browserstack.com or browsershots.org.
I'm assuming the index.html is still cached by cloudControl's Varnish cache, but I can't find any way to clear this cache for my site. How can I do this, or what can I do to get rid of the cached version?
For anyone who wants to test this live: http://dotgame2.cloudcontrolled.com/ (note that this doesn't happen always or for everyone)
Consider using cache breakers that depend on the deployment version. You can also try our *.cloudcontrolapp.com routing tier, which does not provide caching at all: http://dotgame2.cloudcontrolapp.com.
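A cache breaker can be as simple as a version token in your asset URLs that you bump on every deploy, so the cache sees a brand-new URL each time. A sketch in JS (APP_VERSION and the asset path are placeholders):

// Bump APP_VERSION on every deploy so Varnish cannot serve a stale copy.
var APP_VERSION = '2013-06-20';  // placeholder: update per deployment
function versioned(url) {
    return url + (url.indexOf('?') === -1 ? '?' : '&') + 'v=' + APP_VERSION;
}
// Example: load the game script through the cache breaker.
var s = document.createElement('script');
s.src = versioned('/game.js');   // hypothetical asset path
document.head.appendChild(s);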
I made some changes to a CSS file, uploaded it, and saw no change. I cleared my browser's cache, repeated the process, and still nothing. I also tried another browser and then experimented with other files, all with the same result. I then deleted the CSS file altogether; the website still looks the same, and I can still see the file in the browser's console.
I can only get results if I actually change the file names altogether (which is really inconvenient). I don't think there is an issue with FTP overwriting the file, as there are no errors in FileZilla's logs.
Is there another way a website can cache itself? Would anyone know why this is occurring?
EDIT:
I also tried this via cPanel's File Manager and viewed it on another PC, with the same result.
Squid and other web accelerators often sit between a hosted server and your browser. Although they are supposed to invalidate their caches when the backing file changes, that information isn't always sent according to the specification or acted on properly.
Indeed, there can be multiple caches between you and the server, each of which has a chance of hanging onto old data.
First, use Firebug or "Inspect Element" in Chrome.
Verify that the CSS file the browser is loading is the file you think it should load.
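If an intermediate cache turns out to be the culprit, one blunt workaround is to version the URL: any change to the query string makes every cache along the way treat it as a new resource. For example (the file name is a placeholder and the v=2 token is arbitrary; bump it after each upload):

<link rel="stylesheet" href="styles.css?v=2">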
Good luck.
Browsers can cache things. Did you try Shift+F5 on your webpage to force a reload of everything?
Maybe the main server has caching configured through other servers; check with your IT department. If that is the case, you need to ask them to invalidate the cache across all of the caching servers.
I had the same issue with FileZilla. To solve it, you need to clear the FileZilla cache or change the names of the files you are uploading.
Open FileZilla and click on the Edit menu.
Choose Clear Private Data.
In the new dialog box, check the categories you'd like to clear: Quickconnect history, Reconnect information, Site Manager entries, Transfer queue.
Finally, click OK to confirm.