I'm building a Ghost blog that will be hosted on Heroku, which cannot persist image uploads, so I searched around on Google and found a way to upload images directly to an Amazon S3 bucket.
After a four-hour fight I'm able to upload images to S3 and read them from the Ghost blog, but I'd also like to be able to delete an image I'm no longer using.
Let's say I upload an image, don't like it, and replace it with another one. S3 keeps both images.
Is there a way to delete the unused image from both Ghost and S3?
I'm using ghost-s3-storage. Thanks
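For context, this is roughly what I imagine the raw delete call against S3 looks like with the official aws-sdk package (the bucket name, key, and region below are placeholders); what I can't figure out is how to hook something like this into ghost-s3-storage when an image is replaced:

var AWS = require('aws-sdk');

var s3 = new AWS.S3({ region: 'us-east-1' }); // placeholder region

// Delete a single object; Bucket and Key are placeholders for the
// bucket ghost-s3-storage writes to and the path of the unused image.
s3.deleteObject({
  Bucket: 'my-ghost-images',
  Key: 'content/images/old-image.jpg'
}, function (err) {
  if (err) {
    console.error('Failed to delete image from S3:', err);
  } else {
    console.log('Deleted unused image from S3');
  }
});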
My NativeScript Angular app uploads an image to the backend, which has a 10 MB limit. How can I determine the size of the upload prior to issuing the HTTP POST request? The uploaded image is base64 encoded, like so:
let imgBase64 = (this.imagesArray.getItem(0).imgsrc).toBase64String("jpeg");
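My rough idea so far is to estimate the decoded byte size from the length of the base64 string (every 4 base64 characters encode 3 bytes, minus padding), as sketched below, but I'm not sure whether the limit applies to the decoded image or to the base64 payload itself:

function base64ByteSize(base64) {
  // every 4 base64 characters encode 3 bytes; '=' padding doesn't count
  var padding = (base64.match(/=+$/) || [""])[0].length;
  return (base64.length * 3) / 4 - padding;
}

var limitInBytes = 10 * 1024 * 1024; // the backend's 10 MB limit

// imgBase64 comes from toBase64String("jpeg") as above
if (base64ByteSize(imgBase64) > limitInBytes) {
  console.log("Image is too large to upload");
}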
Depending on what tool you are using to upload your images, there are different approaches. The easiest to me seems to be the nativescript-background-http plugin. The plugin raises property change events for the bytes uploaded so far, the total upload size, and the current status.
For example, look at this POC app that shows the progress of the uploaded files in bytes.
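A rough sketch of what that looks like with the plugin, going from memory of its README (the endpoint URL and file path are placeholders):

var bghttp = require("nativescript-background-http");

var session = bghttp.session("image-upload");

var request = {
  url: "https://example.com/api/upload", // placeholder endpoint
  method: "POST",
  headers: { "Content-Type": "application/octet-stream" },
  description: "Uploading product image"
};

var task = session.uploadFile("/path/to/image.jpg", request); // placeholder path

// progress events carry the bytes sent so far and the total size
task.on("progress", function (e) {
  console.log("Uploaded " + e.currentBytes + " of " + e.totalBytes + " bytes");
});
task.on("error", function (e) {
  console.log("Upload error, response code " + e.responseCode);
});
task.on("complete", function (e) {
  console.log("Upload complete, response code " + e.responseCode);
});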
I am looking for a way to use Dropzone.js to upload images to Firebase Storage. Doing it the regular way without Dropzone is no problem, but I like the functionality of Dropzone.js and want to use it.
I would appreciate any help.
Not sure this is possible without ripping out the upload internals of Dropzone and replacing them with your own uploader. The other option would be to set up your own server and use a GCS uploader to upload to Firebase Storage.
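If you do go the route of replacing Dropzone's upload internals, here is a rough sketch of the idea, assuming the Firebase v8-style JavaScript SDK and a placeholder element and storage path (not tested against Dropzone's queue/retry logic):

var myDropzone = new Dropzone("#uploader", {
  url: "#",               // never used, but Dropzone requires a url option
  autoProcessQueue: false // stop Dropzone from doing its own XHR upload
});

myDropzone.on("addedfile", function (file) {
  var storageRef = firebase.storage().ref("images/" + file.name);
  var uploadTask = storageRef.put(file);

  uploadTask.on("state_changed",
    function (snapshot) {
      // mirror Firebase progress back into Dropzone's progress bar
      var percent = (snapshot.bytesTransferred / snapshot.totalBytes) * 100;
      myDropzone.emit("uploadprogress", file, percent, snapshot.bytesTransferred);
    },
    function (error) {
      myDropzone.emit("error", file, error.message);
    },
    function () {
      myDropzone.emit("success", file);
      myDropzone.emit("complete", file);
    }
  );
});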
I'm trying to build an e-commerce site with an admin page where the administrator can upload images of certain products.
I'd like Meteor to upload those images to a folder and then display them on that product's page.
I know that normally the image files that the client will be using should be inside the 'public' folder, but I'd like to know more about what other options I might have.
Also, if I upload a new file to the 'public' folder, or delete a file from it, the website refreshes itself, which is good or bad depending on the effect you are after.
Here are my questions:
What if I create an 'uploads' folder on the server and upload the images to that folder? Would it be possible to display the images inside the 'uploads' folder in the client's browser? How?
Is there a way to use the browser to access the contents of the 'public' folder?
Is there a way to stop the 'reactivity' of the site when changes happen in the 'uploads' folder I created?
Is uploading the images to the 'public' folder the best solution available to this problem?
Thank you very much for the help
When dealing with what will likely be a large number of images, I like to offload not only the storage but also the processing to a third party.
My go-to app in this situation would be Cloudinary. Here's why:
Storage - I can store the original images outside of my application, which is a huge benefit for keeping images in sync from dev to prod.
CDN - I get the extra benefits of images being quickly loaded from the Cloudinary CDN.
Off-load Processing - All of the image processing is handled by Cloudinary, so it doesn't slow down my app as a whole.
Image Manipulation - I can make calls for the original image, calls for just a thumbnail, and calls that apply a manipulation, e.g. :effect => grayscale. So if a 1000x1000px image was uploaded, I can request a 50x50px version from Cloudinary that returns the image cropped to that exact size, rather than using CSS to shrink a huge image.
Flexibility - I can change the size of images and return that exact size to the app without having to re-upload anything. For example, if my product page pulled in thumbs at 40px, I could easily make a call to grab the same image at 50px (see the URL sketch after this list).
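To make that concrete, here is a rough sketch of the URL-based transformations described above; the cloud name and public ID are placeholders for your own account and image:

var cloudName = "demo";      // placeholder cloud name
var publicId = "sample.jpg"; // placeholder public ID of an uploaded image

// Original image:
var original = "https://res.cloudinary.com/" + cloudName + "/image/upload/" + publicId;

// 50x50 crop generated on the fly, no re-upload needed:
var thumb50 = "https://res.cloudinary.com/" + cloudName + "/image/upload/w_50,h_50,c_fill/" + publicId;

// Grayscale effect applied on delivery:
var grayscale = "https://res.cloudinary.com/" + cloudName + "/image/upload/e_grayscale/" + publicId;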
Hope this helps.
http://cloudinary.com/
You can do all of this using the Meteor package CollectionFS. The package is well documented and you have a variety of options that you can use for storing the uploaded files. CollectionFS also gives you the ability to manipulate images on upload, such as creating a resized thumbnail.
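A rough sketch of what a CollectionFS setup can look like, assuming the cfs:standard-packages, cfs:gridfs and cfs:graphicsmagick packages; the collection, store names, and thumbnail size are placeholders:

// Declared in shared code so both client and server know the collection.
Images = new FS.Collection("images", {
  stores: [
    new FS.Store.GridFS("images"), // store the original upload
    new FS.Store.GridFS("thumbs", {
      // resize a copy on upload, as mentioned above
      transformWrite: function (fileObj, readStream, writeStream) {
        gm(readStream, fileObj.name()).resize("50", "50").stream().pipe(writeStream);
      }
    })
  ]
});

// On the client, inserting a file from an <input type="file"> is one line:
// Images.insert(event.target.files[0], function (err, fileObj) { /* ... */ });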
I realize this question is a bit old.
I had the same problem; one of the solutions that worked for me is using meteor-upload: https://github.com/tomitrescak/meteor-tomi-upload-jquery
Definitely don't store stuff in the public directory - it will slow down app startup, and hot code refreshes on image upload could easily cause the app to crash once there is a decent number of images in there.
Any of the above solutions that store images elsewhere would work. One other option is using the peerlibrary:aws-sdk package to upload files to S3, which is what I use for several apps and have found to be very clean (a rough sketch is below).
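A rough sketch of a server-side upload via peerlibrary:aws-sdk, which exposes the official aws-sdk's AWS object on the server; the bucket name, key, region, and settings keys are placeholders:

// server-side code
var s3 = new AWS.S3({
  accessKeyId: Meteor.settings.awsAccessKeyId,          // placeholder settings keys
  secretAccessKey: Meteor.settings.awsSecretAccessKey,
  region: "us-east-1"                                   // placeholder region
});

Meteor.methods({
  uploadProductImage: function (fileName, base64Data) {
    check(fileName, String);
    check(base64Data, String);

    var buffer = new Buffer(base64Data, "base64");
    // wrapAsync turns the callback-style putObject into a synchronous call
    var putObjectSync = Meteor.wrapAsync(s3.putObject, s3);

    putObjectSync({
      Bucket: "my-product-images",   // placeholder bucket
      Key: "products/" + fileName,
      Body: buffer,
      ContentType: "image/jpeg"
    });

    return "https://my-product-images.s3.amazonaws.com/products/" + fileName;
  }
});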
Storing the image as a base64 string in MongoDB is also an option. It is useful for posting to APIs and saves you the worry of having to deal with another third party.
Images can be uploaded through Drupal's frontend interface with the Image module. However, I'd like to be able to upload and create image nodes remotely by requesting a URL and passing the image as a parameter. I have the REST API module, which works fine, but I can't figure out what function I need to call in the backend to create the image node. Does anyone know how to do this or if there's another module that does something like this?
Thanks.
The Gallerix module lets you turn on "Repository mode", which makes any images uploaded to an FTP folder available to be included in its image galleries. It would be a simple step from there, using Drupal 6 triggers and actions, to publish a node containing only that image.