Suddenly all the changes I've made on a Joomla (2.5) site have been lost; the site has been reset to its original state. How can this happen, anyone?
How this can happen is a mystery to all of us (without the log files). How it can be undone, however, is a question worth answering.
One way is to restore your database backup. That is, obviously, only possible if you have made one.
Another way is to restore any export you may have made that contains the data you are now missing. That is, obviously... you get the point.
If you have not secured your data in any way, it is probably gone. You could contact your server administrator to see if they have any backups, but other than that your data is lost.
I built a translation system for fun which reads all trans()/Lang::get() calls in my app and presents them along with their current translations in the localisation files in resources/lang, so that an admin user can enter new translations which updates a single localisation file on the fly.
Everything works as intended, but there's one minor annoyance: every time the form is sent and the localisation file is updated, the page reloads (through a redirect()->route() call, not e.g. redirect()->back()), but most of the time, it still displays the old information even though the file has been updated properly.
If I refresh, the changes show up after 0.5-5 seconds, which makes me assume it's a cache issue. So the question is: can I force the language cache to be bypassed or refreshed while I'm in the translation system, or is there another and/or smarter way? I did try sleeping for a couple of seconds, but it made the user experience kind of crappy.
I've got the same issue.
In my controller I added a sleep and flashed a flag so the page gets an extra refresh from JS.
// crude workaround: wait for the file write to settle, then flash a flag for the view
sleep(2);
return back()->with("refresh", "yes");
and then in my view:
@if (session('refresh'))
<script>
    location.reload(true);
</script>
@endif
I know it's a stupid solution, but it works. If somebody knows a better way to do it, please write me a comment.
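A less hacky alternative, assuming the stale reads come from OPcache still serving the old compiled copy of the localisation file (the question doesn't confirm which cache is involved, so this is a guess): invalidate just that file right after writing it. A minimal sketch; saveTranslations() and $path are hypothetical names used for illustration.

// Hypothetical helper: write the updated translations, then drop any cached
// copy of that one file so the next request reads the fresh version.
function saveTranslations(string $path, array $lines): void
{
    // resources/lang files are plain PHP files returning an array,
    // so re-export the array and overwrite the file.
    file_put_contents($path, '<?php return ' . var_export($lines, true) . ';');

    // If OPcache is enabled it may keep serving the old version of the file;
    // invalidate it explicitly (this is a no-op when OPcache is not loaded).
    if (function_exists('opcache_invalidate')) {
        opcache_invalidate($path, true);
    }
}

With something along those lines in place, the sleep()/reload workaround above may not be needed at all.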
I'm using CakePHP with Backbone.js. I set up a controller just to give me JSON output for fetching my data, e.g. client names etc., to pass into each Backbone model.
This was all working, or appeared to be; however, it now gives me some seemingly random 403 errors when the page / form is saved or reloaded. I have no idea why. If it can access the data to start with, and does, then why would it lose access after a save or reload?
I have tried $this->Auth->allow and it does appear to fix the problem, but this data is, or could be, important, and I need it not to be accessible to everybody who might guess my access path.
Now, I have read a number of articles on here; most point to read/write access on the files you are accessing, but in my case it's just a path, /XXXX/XXXXX/myjson/clients for example.
I can post my code if needed, but I am not sure what the problem is: is this a CakePHP issue, or is Backbone not requesting the data correctly?
Please be aware that I am dyslexic, so please be kind about my question if I have not explained myself well, and give me some time to re-word / edit my post.
Thanks,
For anyone else looking at this: I had added autoRegenerate to the Session settings written via Configure. For some reason it looks like CakePHP was taking too long to regenerate a new session cookie and serve my request for the data at the same time.
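For reference, a minimal sketch of the kind of setting being described, assuming CakePHP 2.x with the session configured in app/Config/core.php (the exact values are illustrative, not taken from the question):

// app/Config/core.php (CakePHP 2.x)
// With 'autoRegenerate' enabled, CakePHP rotates the session id frequently.
// An AJAX request arriving while the id is being rotated can carry a stale
// session cookie and be rejected by Auth with a 403.
Configure::write('Session', array(
    'defaults'       => 'php',
    'autoRegenerate' => false, // disable the frequent id regeneration
));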
I’ve tried several solutions available on forums and sites with tutorials, but none managed to solve my problem.
The sales reports are not displayed and I always get the message that I need to update statistics if the time zone has been changed.
I have already updated all the statistics, and I changed the timezone in the admin panel and also in app/etc/config.xml (America/Sao_Paulo), but without success.
Please, I need to deliver this shop urgently this week.
Thank you.
I really doubt it is still useful for you to know the solution to the problem, considering how much time has passed since the date of publication. It may be interesting for other people anyway, so here it goes:
Even though the notice saying 'This report depends on timezone configuration. Once timezone is changed, the lifetime statistics need to be refreshed. Last updated: 15/10/2013 12:25:34. To refresh last day's statistics, click here.' claims to refresh statistics, it does NOT. The proper way to get it to work is to go to Reports -> Refresh Statistics in the Admin Panel. Then all there is to do is select all items and refresh them.
That is a neat solution which, unlike many others, avoids touching Magento's code.
If there is no other major issue in your Magento installation, then refreshing the Lifetime Statistics will probably fix the issue.
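If you'd rather trigger the same refresh from code (say, from a cron job or a one-off script), something along these lines should work in Magento 1; the file name refresh-stats.php is hypothetical and the list of report codes is abbreviated, so treat it as a sketch rather than a drop-in script:

<?php
// refresh-stats.php (hypothetical one-off script, run from the Magento 1 root)
require 'app/Mage.php';
Mage::app('admin');

// Each resource model below backs one of the lifetime statistics reports;
// aggregate() rebuilds its aggregation tables, which is essentially what the
// Reports -> Refresh Statistics admin action does.
$reports = array(
    'sales/report_order',
    'sales/report_invoiced',
    'sales/report_refunded',
    'sales/report_shipping',
);
foreach ($reports as $code) {
    Mage::getResourceModel($code)->aggregate();
}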
I am wondering if someone can help me. I recently moved my site live to test it, so I basically copied the database and uploaded the files. This means I had already set up a user account on my local server. However, I have found the login to be a bit temperamental: when I went to log in it wouldn't accept my credentials, so I had to reset my password, after which it worked fine. Then I registered another user and that worked fine too, but when I went to log in the next day it wouldn't let me, even though I know the details were correct…
Has anyone else experienced something similar? Also, if you have, how did you fix it?
Thanks
One thing to remember with Tank Auth is that, by default, the password hashes are tied to a single server and will not work on any other. You can change that inside the tank_auth config file by setting phpass_hash_portable to true, but portable hashes are less secure. My recommendation is to just recreate the accounts.
That is the only problem I can think of in your situation; I hope it fixes it.
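For reference, the setting mentioned above lives in the Tank Auth config file (typically application/config/tank_auth.php in a CodeIgniter install); a minimal sketch:

// application/config/tank_auth.php
// TRUE  => portable (weaker) phpass hashes that verify on any server
// FALSE => hashes tied to the server's own crypt() support
//          (the behaviour described in the answer above)
$config['phpass_hash_portable'] = TRUE;

Note that flipping this only affects hashes created afterwards; existing accounts would still need their passwords reset or be recreated, as suggested above.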
Situation: Google has indexed a page in a forum. The thread has now been deleted. How, if at all, can I make Google and other search engines delete the cached copy? I doubt they would have anything against that, since the linked page no longer exists and keeping the index updated and valid should be in their best interests.
Is this possible or do I have to wait months for an index update? Or will the page now stay there forever?
I am not the owner of the site in question, so I can't change robots.txt, for example. I would like to force the update as a third party.
I also noticed that a new page I created on that site two days ago is already in the cache. Given that, can I estimate how long it will take for an invalid page on this domain to be dropped?
EDIT: So I did the test. It took Google just under 2 months to drop the page. Quite a long time...
It's damn near impossible to get it removed; however, replacing the page with entirely blank content will ensure that you nuke the page's ranking when it is re-crawled.
You can't really make Google delete anything, except perhaps in extreme circumstances. You can adjust your robots.txt file to encourage a shorter revisit interval, which might update things sooner, but if it is a low-traffic site you might not get a revisit very soon.
EDIT:
Since you are not the site owner, the remaining option is to have the page's meta tags include "revisit-after" tags, as discussed here.
You can't make search engines remove the link, but don't worry: the link will soon be removed since it is no longer active. You need not wait months for this to happen.
If your site is registered with Google Webmaster Tools, you can request that pages be removed from the index. It works; I tried and used it in the past.
EDIT: Since you are not the owner, I am afraid that this solution would not work.