Joomla feed fetching issue after an upgrade

I have a feed problem and I hope you can help. A few days ago I upgraded Joomla from 2.5.28 to 3.4.4. The process was successful and I thought everything was OK, but then I noticed that my K2 feed isn't being fetched by a partner website. I have six partners who use my RSS feed ( http://my.site/index.php?option=com_k2&view=itemlist&layout=category&Itemid=200&format=feed ), and five of the six are still working, but one of them no longer fetches it. This is strange, because all partners use the same feed link. I tried a few things: I removed old, unnecessary components and disabled unnecessary plugins, but nothing changed. The upgrade caused this, no doubt, but I don't know how. Maybe the feed fetching stopped on my partner's site and can't start again, so I want something that can restart it. Is there a method to force a feed refresh on my partner's site, or do you have another idea?
List of things that I tried:
- disabled unnecessary components
- disabled unnecessary plugins
- updated all components, modules and plugins
- removed the system cache and expired cache in the administrator section
I hope you have some ideas. Thanks in advance!

Since the feed is working for five other partners, the problem most likely lies with the sixth partner and not with your website (unless the sixth partner is using a different link).
It might also be that the way this partner retrieves the feed is different from the others. I suggest you get on a phone call with the partner and try to identify the problem on their end. The first question to ask is whether they can access the feed from their browser; the second is which feed URL they are using in their code.
One more thing: if the IP of your website changed during the process, the partner might still be accessing the website through the old IP.
And, of course, the partner's server IP might be blocked somewhere on your end (either by the application or by the server itself), so check that as well.
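If you want to see exactly what the partner's server sees, ask them to fetch the feed with cURL from their machine and inspect the status code and any redirects. A minimal sketch in PHP, using the feed URL from the question:

    <?php
    // Fetch the feed the way a partner's server would and report what
    // comes back. Run it from the partner's network so any IP-level
    // block on your side shows up.
    $feedUrl = 'http://my.site/index.php?option=com_k2&view=itemlist'
             . '&layout=category&Itemid=200&format=feed';

    $ch = curl_init($feedUrl);
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
    curl_setopt($ch, CURLOPT_FOLLOWLOCATION, true); // the upgrade may have introduced redirects
    curl_setopt($ch, CURLOPT_TIMEOUT, 30);
    $body = curl_exec($ch);

    if ($body === false) {
        echo 'cURL error: ' . curl_error($ch) . PHP_EOL;
    } else {
        echo 'HTTP status: ' . curl_getinfo($ch, CURLINFO_HTTP_CODE) . PHP_EOL;
        echo 'Final URL:   ' . curl_getinfo($ch, CURLINFO_EFFECTIVE_URL) . PHP_EOL;
        echo 'First bytes: ' . substr($body, 0, 200) . PHP_EOL;
    }
    curl_close($ch);

A 403 or a timeout from the partner's network but a 200 from everywhere else points straight at an IP block; a redirect to an unexpected final URL points at changed routing after the upgrade.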

Related

Why does my Linkedin share button not work?

I want to create a share button for LinkedIn. The button itself is all set up, but nothing happens when I click it. I researched a bit and concluded that, using the same sharing mechanism, other sites work but mine doesn't.
I narrowed the problem down and now I'm trying to figure out why google.com works but my site doesn't. I'm not using my real company website here because it's private information, but it's a website that has been on the internet for more than 10 years (in case that's useful). When I open the links below, my website throws an error, but Google works fine.
Ⓧ (fails) https://www.linkedin.com/cws/share/?url=https://www.my-company-website.com
〇 (works) https://www.linkedin.com/cws/share/?url=https://www.google.com
Is there any pre-requisite I'm missing, which makes my site not work?
I realized my server was blocking LinkedIn (to reduce traffic from LinkedIn bots). That's why it wasn't working.
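One way to confirm a block like that: request your own page twice, once with a normal browser User-Agent and once with LinkedIn's crawler User-Agent, and compare the status codes. A sketch; the target URL is the placeholder from the question, and the bot UA string matches LinkedInBot's commonly seen format, which may change over time:

    <?php
    // Compare how the server answers a browser vs. the LinkedIn crawler.
    $url = 'https://www.my-company-website.com/';
    $agents = array(
        'browser' => 'Mozilla/5.0 (Windows NT 10.0; Win64; x64)',
        'bot'     => 'LinkedInBot/1.0 (compatible; Mozilla/5.0; Apache-HttpClient +http://www.linkedin.com)',
    );

    foreach ($agents as $label => $ua) {
        $ch = curl_init($url);
        curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
        curl_setopt($ch, CURLOPT_USERAGENT, $ua);
        curl_setopt($ch, CURLOPT_NOBODY, true); // status code is all we need
        curl_exec($ch);
        printf("%-8s HTTP %d\n", $label, curl_getinfo($ch, CURLINFO_HTTP_CODE));
        curl_close($ch);
    }
    // 200 for the browser but 403 (or similar) for the bot confirms a UA block.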
A hint: I was working on a webpage closed to outside users, and that also caused problems with the LinkedIn share button.
Hint 2: the website uses a Let's Encrypt SSL certificate, and the claim here that it breaks the share button is false: https://wordpress.org/support/topic/linkedin-share-button-not-working-3/ . It works fine!
If you ever get stuck trying to figure out why your page simply doesn't populate nice preview data on your LinkedIn share page, check out the LinkedIn Post Inspector.
Insert the URL of your page (e.g., example.com), not the share URL (e.g., linkedin.com/share?url=example.com). You'll get detailed information on how your site will appear and why; try sharing wikipedia.org, for instance, to see what complete metadata looks like.
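The Post Inspector largely reports what it finds in your page's Open Graph tags, so you can pre-check locally before sharing. A sketch that fetches a page and lists its og:* meta tags (assumes the page is publicly reachable):

    <?php
    // List the Open Graph tags LinkedIn's scraper would read.
    $html = file_get_contents('https://example.com/');
    if ($html === false) {
        die('Could not fetch the page.' . PHP_EOL);
    }

    $doc = new DOMDocument();
    libxml_use_internal_errors(true); // tolerate real-world HTML
    $doc->loadHTML($html);
    libxml_clear_errors();

    foreach ($doc->getElementsByTagName('meta') as $meta) {
        $prop = $meta->getAttribute('property');
        if (strpos($prop, 'og:') === 0) {
            echo $prop . ' = ' . $meta->getAttribute('content') . PHP_EOL;
        }
    }
    // Missing og:title / og:image / og:url is the usual reason a
    // share preview comes out empty.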
Hope this helps someone else with a LinkedIn share issue!

How to modify an old Joomla website to remove a dangerous link flagged by Google

A client told me his old website running on Joomla was flagged by Google for having links to a malicious website. The website was blocked with the typical red security warning in Google Chrome. I redirected the website to a temporary page, but my client wants to bring back the old website while we work on something new.
However, my local machine and server are running Windows Server. I have the original files of the website and its database. Is there a quick way I could remove the links (the Google tool only mentions the website "mosaictriad.com") from the Joomla site from my machine? I've tried doing a Ctrl+F for mosaictriad.com in the SQL dump but didn't find anything.
Thanks for your opinion on what I should do next; the objective is simply to quickly clear the website of the security warning and send it back to the people managing his old server.
PS: I don't have direct access to his server, only to the files associated with his Joomla website.
Additional details given by Google:
Some pages on this website redirect visitors to dangerous websites that install malware on visitors' computers, including: mosaictriad.com.
Dangerous websites have been sending visitors to this website, including: navis.be and umblr.com.
Yes, there is a way. Register your site in Google Webmaster Tools, submit your sitemap, and ask Google to rescan your website. They will remove the warning within 24 hours if the scan comes back clean of malware.
Running a virus scanner over the files on your local machine may detect some of the malicious files.
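Since a plain-text search of the SQL dump found nothing, the injection is more likely sitting in the PHP or template files, and it is often base64-encoded, so a literal search for the domain can miss it. A sketch that walks a local copy of the site and flags both the domains Google reported and the classic eval(base64_decode(...)) wrapper (the root path is a placeholder):

    <?php
    // Recursively scan the site files for suspicious strings.
    $root = 'C:\\sites\\old-joomla'; // placeholder: local copy of the site
    $patterns = array(
        'mosaictriad.com',      // domain flagged by Google
        'navis.be',
        'umblr.com',
        'eval(base64_decode',   // common obfuscated-injection marker
    );

    $files = new RecursiveIteratorIterator(new RecursiveDirectoryIterator($root));
    foreach ($files as $file) {
        if (!$file->isFile()) {
            continue;
        }
        $lines = @file($file->getPathname());
        if ($lines === false) {
            continue; // unreadable file, skip
        }
        foreach ($lines as $n => $line) {
            foreach ($patterns as $p) {
                if (stripos($line, $p) !== false) {
                    printf("%s:%d  %s\n", $file->getPathname(), $n + 1, trim($line));
                }
            }
        }
    }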
Alternatively, restore the website to a temporary folder on the web and use a commercial scanning service to help identify and clean the website. I use and recommend myjoomla.com but there are other services such as sucuri.net.
I think your strategy is wrong: you should quickly clean up the website (try overwriting the core files with files from a fresh Joomla install) and then secure it. Once you've done that, contact Google through Webmaster Tools with a reconsideration request (this typically takes a few days to process if it's the first offense). Once Google approves the reconsideration request, the red flag should be removed and the website should be accessible to everyone.

Magento Duplicate Orders

I have a Magento site running version 1.6.2.0 on which I'm experiencing problems with duplicate orders.
Having researched the subject, I have mostly found forum threads explaining that 1.4.x had problems with duplicate orders; the solutions mentioned (even those I found on SO) merely suggest updating Magento to a version newer than 1.4.
I have also found a proposed solution here, but am reluctant to delete observers, which would prevent downloadable purchases from working.
I've also spotted the Array Of Death fix mentioned a few times (e.g., here), but this problem isn't present in 1.6.x; Zend appears to have resolved it.
There are a couple of suggested JavaScript hacks whereby the Confirm Order button is hidden upon submission, but Magento 1.6.x already does this.
I have increased the payment gateway timeout configuration variable to 120 seconds and have yet to see whether it yields results. I can't test it directly because the problem is intermittent (and probably, therefore, caused by communication, or a lack thereof, between the payment gateway and Magento).
I am using Sagepay as the payment gateway.
How might I further debug this?
The link you posted is correct, but I wouldn't use their fix; I would just disable the Mage_Rss module.
Mage_Rss has several observers that call Mage::app()->cleanCache(...) during the checkout process, which is extremely expensive if your installation uses the default filesystem cache and it has grown large.
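For reference, disabling a module in Magento 1.x is a one-file change under app/etc/modules; a sketch of such an override file (the filename is arbitrary, and a name late in the alphabet ensures it is merged after the core declarations):

    <!-- app/etc/modules/ZZ_Disable_Rss.xml -->
    <config>
        <modules>
            <Mage_Rss>
                <active>false</active>
            </Mage_Rss>
        </modules>
    </config>

Clear the cache after adding it and re-test checkout to confirm the duplicates stop.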
I've found the best thing for troubleshooting Magento performance problems is to wire up XHGui and do some profiling. Reading call stacks will also improve your understanding of Magento immensely.
Oh, and I don't know if this is true for Sagepay, but I fixed this problem completely for PayflowPro by rewriting the method that generates transaction IDs to use the quote ID instead of generating unique IDs on every invocation. I started down the path of committing this back, but I'm still on 1.4.2, don't have time to test it on later versions, and it's a pretty significant rewrite. Guess I could just put it out there for someone else to run past Moses...

Issues Logging in Twice After Installing Lightspeed Module

We are having an issue with logging in a second time on our site ever since we installed the Lightspeed module. At first I thought this might have to do with the need for hole punching, but now I'm not sure.
Logging into our website the first time works fine. However, logging in a second time doesn't: the browser just remains on the customer/account/login page with no effect.
I tried to test this on my end and echoed the user's email in the loginPost function of the account controller. When echoing directly from the controller, it was obvious that the user was being logged in, but after the redirect to a page on the site, the user was no longer logged in and appeared as a Guest in the Magento backend (where you can view online customers).
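To make that test more precise than an echo, you could temporarily log the session ID and login state both inside loginPost() and on the landing page; if the two requests show different session IDs, the cookie is being dropped or rewritten between them. A sketch for Magento 1.x (remove after debugging):

    <?php
    // Temporary debug logging: call this inside loginPost() and again
    // from the page you land on after the redirect.
    $session = Mage::getSingleton('customer/session');
    Mage::log(sprintf(
        'session_id=%s logged_in=%d email=%s',
        $session->getSessionId(),
        $session->isLoggedIn() ? 1 : 0,
        $session->isLoggedIn() ? $session->getCustomer()->getEmail() : '-'
    ), null, 'login_debug.log');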
It appears to me as if the session is being lost after the redirect. I'm not sure whether this has anything to do with a switch between https and http, as described in the Stack Overflow question here (http://stackoverflow.com/questions/7823994/magento-session-lost-when-switching-to-https-from-http), where they had also installed Lightspeed. The person there resolved the problem but did not post the solution. Their problem wasn't the same as ours, but I was thinking there may be a connection between the two.
Has anyone seen a problem like this before?
Thanks in advance,
Brenda

How to force a page to be removed from the search engine index?

Situation: Google has indexed a page in a forum, and the thread has since been deleted. How (if at all) can I make Google and other search engines delete the cached copy? I doubt they would have anything against that, since the linked page no longer exists and keeping the index updated and valid should be in their best interest.
Is this possible or do I have to wait months for an index update? Or will the page now stay there forever?
I am not the owner of the respective site so I can't change robots.txt for example. I would like to force the update as the "third party".
I also noticed that a new page I created on that resource two days ago is already in the cache. Given that, can I estimate how long it will take for an invalid page on this domain to be dropped?
EDIT: So I did the test. It took Google just under two months to drop the page. Quite a long time...
It's damn near impossible to get it removed; however, replacing the page with entirely blank content will ensure that you nuke the page's ranking when it is re-spidered.
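For completeness, if you can get the site owner to act, an even stronger signal than blank content is an explicit 410 Gone response at the deleted thread's URL, which search engines treat as permanent removal (a 404 also works, just more slowly). A minimal sketch:

    <?php
    // Serve an explicit "Gone" status for the deleted thread's URL.
    http_response_code(410);
    header('Content-Type: text/html; charset=utf-8');
    echo '<p>This thread has been permanently removed.</p>';
    exit;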
You can't really make Google delete anything, except perhaps in extreme circumstances. You can adjust your robots.txt file to encourage a shorter revisit interval, which might update things sooner, but if it is a low-traffic site, you might not get a revisit very soon.
EDIT:
Since you are not the site owner, you would have to ask the owner to add "revisit-after" meta tags to the page, as discussed here.
You can't make search engines remove the link, but don't worry: it will soon be removed, since the link is no longer active. You shouldn't need to wait months for this to happen.
If your site is registered with Google Webmaster Tools, you can request that pages be removed from the index. It works; I tried and used it in the past.
EDIT: Since you are not the owner, I am afraid that this solution would not work.
