I am using Google Drive to show images on a website.
I'm using the following url to show the images:
https://drive.google.com/uc?export=view&id={fileId}
It works fine on most occasions, but some users don't see the images and get the following error message:
Failed to load resource: the server responded with a status of 403 ()
The image files and folders on Google Drive are shared so that anyone with the link can view them.
The problem does not seem to be browser-related.
Does anyone have an idea how to fix this? Is there something I forgot to do (do I need some kind of API key in the link)?
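For reference, this is roughly how the images are embedded (FILE_ID stands in for a real file id):
// Build an <img> pointing at the Drive export URL and log when it fails,
// e.g. with the 403 described above. FILE_ID is a placeholder.
var img = document.createElement('img');
img.src = 'https://drive.google.com/uc?export=view&id=FILE_ID';
img.onerror = function () { console.error('Failed to load Drive image'); };
document.body.appendChild(img);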
I've dealt with this problem working with Ionic Angular. In the latest versions you should use https://lh3.google.com/u/0/d/{img-ID} as Edso said, and it solves the problem. Be aware that the images will only load if you are logged into a Google account; otherwise the response status will be 302 and the image will never load.
OK, I know I am late, and my answer may not work for you, but it did for me.
I was getting the 403 Forbidden error when I used this link:
https://drive.google.com/uc?export=view&id={file-id}
The problem wasn't with the link but with the file access. Earlier, the file's access was set to Restricted, and the HTML page got a 403 response for it.
Once I changed the access from Restricted to "Anyone with the link", the HTML was able to fetch the image, after a couple of automatic redirects, and it worked.
Make sure public access is set to Viewer, or anyone who knows how to open devtools will be able to comment on or edit your file. Also, I shared the original link only because I had already deleted the file at that link before publishing this answer.
As of today I have the same issue (just like a few months back, when Google ended classic Sites, which I (mis)used as an image repository).
After I cleared all cookies from the browser, the images were visible again.
So that solved this problem.
If your problem persists, you should change the URL.
This URL gives a 403 error:
https://drive.google.com/uc?export=view&id={img-ID}
Change that to:
https://lh3.google.com/u/0/d/{img-ID}
For example, setting a favicon in a web app:
// Use the lh3.google.com form of the link instead of drive.google.com/uc
return template.evaluate()
    .setFaviconUrl('https://lh3.google.com/u/0/d/1qMrq5o54jhg5jedu9fLtzUG4dF1ijKZT1#.ico');
Hey guys, I am having a problem that I see others having online, but I have yet to find a fix for my website.
Facebook has not been detecting my images as of a couple of hours ago. I tried disabling hotlink protection in Cloudflare, tried the FB debugger and scraping tool, etc., with no solution.
For example, if you try to share this page, no picture appears: http://www.yardhype.com/12-year-old-jamaican-girl-wins-master-chef-junior-competition/
ERROR message:
"Provided og:image URL, ........testsite/wblob/5433A6CF57A599/2803/3DF62/w3l6_ERf2C3ADvxqhgccQg/jasmin.png could not be downloaded because it exceeded the maximum allowed sized of 8Mb."
The image is not over 8 MB.
Also, the correct URL is shown in the raw tags area.
When I opened that link, http://www.yardhype.com/12-year-old-jamaican-girl-wins-master-chef-junior-competition/, I didn't see the image regardless. I opened up Chrome's debugger and saw this:
A Parser-blocking, cross site (i.e. different eTLD+1) script,
http://ajax.cloudflare.com/cdn-cgi/nexp/dok3v=85b614c0f6/cloudflare.min.js,
is invoked via document.write. The network request for this script MAY be
blocked by the browser in this or a future page load due to poor network
connectivity. If blocked in this page load, it will be confirmed in a
subsequent console message. See
https://www.chromestatus.com/feature/5718547946799104 for more details.
So the way that page is displaying the image is likely not up to standards.
I've got some automatic emails that are sent out upon signup completion for my site.
Until recently, they worked fine. Now Google's new system is rewriting the images and (supposedly) storing them in its cache.
However, Google's rewriting of my image links is completely breaking them, giving a 500 error and a broken-link image.
Let's say my normal image URL is:
http://www.mysite.com/images/pic1.jpg
Google is rewriting this to:
https://ci5.googleusercontent.com/proxy/vI79kajdUGm6Wk-fjyicDLjZbCB1w9NfkoZ-zQFOB2OpJ1ILmSvfvHmE56r72us5mIuIXCFiO3V8rgkZOjfhghTH0R07BbcQy5g=s0-d-e1-ft#http://www.mysite.com/images/pic1.jpg
However, there is nothing at that URL.
So, either there is something wrong with the links that are being created by Google or the images are just not being uploaded to the googleusercontent server, but I have no idea how to solve the issue.
I'm using PHP, the PHPMailer library, and an Ubuntu server on Amazon EC2, but I'm not sure that is related to the issue.
I think I have figured out the GoogleImageProxy issue.
This is related to caching. Suppose you have recently deployed your PHP code to your server but forgot to upload the images. You test your email logic once, and your system generates an HTML email. When this email hits the Gmail server, GoogleImageProxy tries to fetch the images from your site and store them on its own proxy server. While fetching them, GoogleImageProxy gets a 404 status for each missing image and a 403 for the protected ones, and it stores these statuses on its own proxy server.
Now you open your email and notice 404s against your images. That is understandable: you immediately realize you forgot to upload some images, so you upload them to your server, and you also fix the permissions on the protected images.
You are all done now. You run your PHP email script once again and receive another email in your Gmail or Hotmail inbox. You have fixed all the issues with your images, so they should now be displayed in the email content, but you are still unable to see them.
Ah, perhaps you forgot to clear your browser's cache. You clear it and load the Gmail or Hotmail page once again, but the result is still the same. You apply dozens of fixes and patches and run your PHP email script a thousand times, but the result is still the same. No improvement.
THE REAL PROBLEM
What is going on? Let me explain. Go to your access log and look for requests from GoogleImageProxy. You'll be surprised to see only two or three requests from GoogleImageProxy, depending on the number of different images used in your email. GoogleImageProxy never tried to fetch the images again, even after you fixed the issues by uploading the missing images and setting permissions on the protected ones. Why? Clearing your browser's cache has no impact: GoogleImageProxy will not fetch fresh images even for a newer email, because the images are now cached on GoogleImageProxy's side, along with their last status code, not in your own browser.
GoogleImageProxy sets its own expiry date for the images, I think one month, so a fresh copy will only be fetched after the expiry date. You cannot force it to re-fetch the cached copies, but you still need to display images in your email. What can be the solution?
THE SOLUTION
The following is the only way to force GoogleImageProxy to fetch your images:
Rename your images to something else, keeping a png, jpg, or gif extension only.
Don't use any kind of query string in your image URL, like ?t=34343.
Your image must have png, jpg, or gif as its extension.
Your image URL must map directly onto the image file.
If you need to use a proxy URL for your protected images, then your response must include the proper header, such as Content-Type: image/jpeg (a sketch of such an endpoint follows this list).
The file extension and the Content-Type header must match.
The status code must be 200, not 403, 500, etc.
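To illustrate those last requirements, here is a minimal sketch of an image endpoint that maps the URL directly onto a file, returns a matching Content-Type, and answers with a 200. It is written in Node.js purely for illustration; the directory and port are placeholders, and the same rules apply to a PHP or any other backend.
const http = require('http');
const fs = require('fs');
const path = require('path');

http.createServer((req, res) => {
  // Map /pic1.jpg directly onto a .jpg file on disk; ignore any query string.
  const file = path.join(__dirname, 'images', path.basename(req.url.split('?')[0]));
  fs.stat(file, (err, stat) => {
    if (err) {
      res.writeHead(404); // missing file: report it honestly, never a bogus 200
      return res.end();
    }
    res.writeHead(200, {
      'Content-Type': 'image/jpeg', // must match the .jpg extension
      'Content-Length': stat.size,
    });
    fs.createReadStream(file).pipe(res);
  });
}).listen(8080);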
IMPORTANT NOTE
Repeat the whole process for every run of your PHP email script, because GoogleImageProxy will cache your images each time, and you'll have to go through the same steps for every new try.
Hopefully this will fix the issue for most people.
Based on your example, it looks like you are using traditional extensions (.jpg, .png, .gif). Some folks on this thread, describing the same issues you are facing, have stated that using those extensions solves the problem.
Other possible solutions:
Image links broken in Gmail because of google's Image proxy
Doubtful, but maybe a cookie problem
Image URL proxy whitelist setting - this has turned out to be the solution for a few users who are under Google Apps. Via Gmail is not showing image when image url is getting appended with https://ci4.googleusercontent.com/proxy
I was having a similar issue, but it was caused by the length of the URL. Google generates the following URL when caching an image from Gmail:
https://ci4.googleusercontent.com/proxy/[hash]#[url]
The hash is generated from the URL of the image, but its size varies with the characters used. I ran several tests with different-sized URLs and found that the cached image would consistently fail to load (400 / Invalid Request) if the hash exceeded 2076 characters in length (close to 2048 bytes plus metadata? not sure).
An image URL can produce a hash that long at roughly 1000 special characters, or 1500+ simple characters. If the hash exceeds 2076 characters, the request fails.
I realize this is an old post, but hopefully this helps other devs scouring Google.
I know this is an old question, but the same thing happened to me. When I checked my access logs, this is what I found:
www.example.ca 66.249.85.50 - - [10/Apr/2014:17:57:18 -0400] "GET /newsletters/Apr10_2014/cad/cad2.jpg HTTP/1.1" 403 457 "-" "Mozilla/5.0 (Windows; U; Windows NT 5.1; de; rv:1.9.0.7) Gecko/2009021910 Firefox/3.0.7 (via ggpht.com GoogleImageProxy)"
You can see that my server was blocking GoogleImageProxy, giving it a 403 Forbidden reply. I checked my .htaccess and, sure enough, I was blocking the term PROXY. After removing that rule, the images now appear just fine in Gmail. Hope that helps.
I just tried the following: after replacing the image (without changing the image name), opening the email in a new browser shows the new image, and Ctrl+F5 (which forces a cache refresh) in Chrome (my default browser) also shows the new image.
Use .png or .jpg, otherwise the image will not render.
The proxy URL prefix is added automatically, e.g.:
https://ci3.googleusercontent.com/proxy/jTpYlM6RUv7Wi8Hxjha4fzExKFy9mjyh133MKKfo3FuV3toLToG6zJcA0IAdIMEW75pY6pkEd2aOSVhWIn0A82q-24YaAd-_k00wIMHwIuUBiy9vEGrMpAW73HaHQmViuESP7A=s0-d-e1-ft#
HTTPS image locations do get cached. Several of our production environments have no problems with Gmail proxying image locations that use an HTTPS URI. I could see Gmail ignoring your content if the SSL certificate is invalid in some way.
Check that the content-type returned for the image file by your server is correct.
You can check this using Fiddler.
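If you'd rather check from a script than from Fiddler, a quick sketch (Node.js 18+, where fetch is built in; the URL is the example one from the question and stands in for your own):
// Print the status and Content-Type your server returns for an image.
// GoogleImageProxy expects a 200 and an image/* type matching the extension.
fetch('http://www.mysite.com/images/pic1.jpg').then((res) => {
  console.log(res.status, res.headers.get('content-type'));
});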
In my case the size of the file was the problem: it was 22 MB (I know, right?), and after we reduced it everything started working like a charm.
Check the file size, and if it's too big, compress it.
I know this is an old question, but I've run into this problem. In my case the images are stored in Google Cloud Storage. What is interesting is that the link
https://storage.cloud.google.com/{bla_bla}/logo.png
returns a 307 (Temporary Redirect) with a Location header containing something like
https://{xxx}-apidata.googleusercontent.com/{bla-bla_bla}/logo.png?{zzz}
It seems GoogleImageProxy does not process the 307 correctly.
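A quick way to see that redirect yourself, as a Node.js sketch (the bucket path is a placeholder, like the {bla_bla} above):
const https = require('https');

// Request the storage.cloud.google.com URL without following redirects and
// print the status plus the Location header it points to.
https.get('https://storage.cloud.google.com/your-bucket/logo.png', (res) => {
  console.log(res.statusCode);        // e.g. 307
  console.log(res.headers.location);  // the ...googleusercontent.com URL
  res.resume();                       // discard the body
});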
I have a solution to this problem which worked for me. If you are using PHPMailer, you just have to use its option for embedding the image, like this:
$mail = new PHPMailer();
// Embed the image inline and give it the CID 'logoimg' so the HTML body can reference it
$mail->AddEmbeddedImage('../absolutepath/image/image.jpg', 'logoimg', 'image.jpg');
Here we give the absolute path of the image and a CID of 'logoimg' (or whatever name you want).
Now you can reference this 'logoimg' anywhere in your HTML body, like this:
$mail->Body = "
<h1>Test of PHPMailer html body with image</h1>
<p>This is a test picture: <img src=\"cid:logoimg\" /></p>";
$mail->send();
That's All.
I had this issue when I was sending GIFs. I found that file size matters to Google's proxy server. I suggest making the files as small as possible and seeing if that works. You can test from your Gmail account by composing an email and adding a photo from a URL: if the GIF shows up while you are composing the email, it will be deliverable.
Happy coding.
Is it working from Outlook/Hotmail? If it is, we can isolate this as a Google issue; in your case it is not.
The size of the image can be a problem. Try reducing it and see.
www.mysite.com might be accessible from your system, but is it also accessible from Google's servers?
Try changing the extension; this is the trick. You might have tried several things, but the proxy would still fetch from its cache (which invalidates your efforts). When the extension changes, it fetches again, and all the work you did before comes into play; if it works, you might think it was the 'extension' that did the trick (like many of those who talk about extensions).
When I ran into this issue, the problem was that the path to the image in the email template accidentally had a triple slash in the URL, e.g. https:///content.example.org/image.png. This was hard to spot, and while it worked in other email clients, which could successfully resolve the URL, Google's image proxy wasn't able to handle it and returned a 404 for the proxied image address.
It's March 6 and you've probably already figured this out, but I thought I'd chime in to help others. I discovered that JPGs don't work in Gmail, while the PNG format works great. Sorry I can't explain why, but sometimes it's better not to ask. Use PNG!
Here's how I develop a bookmarklet that gets an input control's value on a web page:
I write a JavaScript function, add the bookmarklet to my browser, load my test web page, and test the bookmarklet; the result is OK.
But when I test the bookmarklet on an HTTPS website, it cannot get the input control's value. Why doesn't the bookmarklet work on the HTTPS website? Is there any way to make it work on HTTPS sites?
3 questions:
Why can't you get the input value: there is no reason why it should not work; almost certainly you are looking for the wrong id (see the sketch below).
Do bookmarklets work on HTTPS: absolutely, HTTPS is not the problem.
Can I make it work on HTTPS sites: if you provide a code sample, we might be able to tell you what is wrong with it.
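For comparison, a minimal bookmarklet that reads an input's value looks something like this (the id 'query' is only a placeholder; it has to match an element that actually exists on the page):
javascript:(function () {
  // Look up the input by id and show its current value.
  var el = document.getElementById('query');
  alert(el ? el.value : 'No element with id "query" on this page');
})();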
I know this is a pretty old question, but since I came across it while searching for a similar problem, I will add my thoughts. If you wrote your own bookmarklet, this is most likely caused by your bookmarklet trying to load insecure content. If your bookmarklet references other static content on your own server, such as HTML, JS, CSS, or image files, the browser will block that content from loading on an HTTPS page, because insecure (HTTP) resources are not allowed on secure pages. This is also discussed in this related question. If you, or someone else viewing this, is having the same problem, serve your content over HTTPS or reference only other content that is served over HTTPS.
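A rough sketch of what that looks like inside the bookmarklet: load any helper assets from an HTTPS URL (the URL below is a placeholder), so an HTTPS page will not block them.
javascript:(function () {
  // Load the bookmarklet's helper code over HTTPS so HTTPS pages don't block it.
  var s = document.createElement('script');
  s.src = 'https://example.com/bookmarklet-helper.js'; // placeholder URL
  document.body.appendChild(s);
})();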
I found this problem all over the net, but no answer yet, so maybe someone here has solved it?
I built a page relying heavily on jquery.address. It has one index page and the rest loads dynamically via Ajax, following Google's /#!/ scheme for crawlable pages. Now I want to add Facebook's Like or Share button, but I can't get it to grab the actual page title or URL.
Whatever I do, it always falls back to the title and URL of the index page. I tried:
(obviously) changing the title and Open Graph meta tags on load of the new parts;
"linking" the crawler page (?_escaped_fragment_=xyx) but specifying the #! page in the meta tags;
"sharing" with a given title and URL.
I never get anything but a link to the index page, or a blank "share" to the right URL with the title and thumbnail ignored.
Has anyone got a similar setup working?
Thanks for any hints,
thomas
Facebook is actually using #! now and it works! If you build your site so that http://site.de/?_escaped_fragment_=something is identical to http://site.de/#!/something, all you have to do is "share" the #! URL and it will display the info from the escaped-fragment page.
Use this URL to check: http://developers.facebook.com/tools/debug
But: A much cleaner solution to the problem can be found here: http://github.com/browserstate/history.js/wiki/Intelligent-State-Handling
My guess would be that Facebook's crawler doesn't run Javascript and will always display whatever's actually in the page it gets from the server.
Facebook share has a BRUTAL cache; last time I checked, it was impossible to change the title/description data once it had been scraped :(
The issue I had was that the og:url and the actual URL of the page did not match. I also read a number of comments about placing the og data just after the title element, but I don't think that solved anything.
With regard to issues of caching, it is true that Facebook's caching is "brutal", but it does not cache anything for the lint tool: http://developers.facebook.com/tools/debug.
I use non-hash-bang URLs when sharing links. I process the hard links and redirect them to a hash-bang URL client-side using JavaScript (see the sketch below). That way, if a crawler goes to the hard-linked page, it will display the information just as it would if JavaScript were enabled.
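A minimal sketch of that client-side redirect, assuming the #! scheme described above (exact path handling will vary per site):
// On the hard-linked page, send JavaScript-capable visitors to the hash-bang
// version; crawlers without JavaScript keep the server-rendered content.
(function () {
  var path = window.location.pathname;
  if (path !== '/' && !window.location.hash) {
    window.location.replace('/#!' + path);
  }
})();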
Compare:
http://developers.facebook.com/tools/debug/og/object?q=http%3A%2F%2Flikeapage.com%2F%23!%2FChristmas%2Fvs%2FBacon
and
http://developers.facebook.com/tools/debug/og/object?q=http%3A%2F%2Flikeapage.com%2FChristmas%2Fvs%2FBacon
Hope this helps.