I need to upload an image, display it, and save it so that I don't lose it when I refresh localhost. This needs to be done using an "Upload" button, which prompts for a file selection.
I am using node.js and express for the server-side code.
First of all, you should make an HTML form containing a file input element. You also need to set the form's enctype attribute to multipart/form-data:
<form method="post" enctype="multipart/form-data" action="/upload">
<input type="file" name="file">
<input type="submit" value="Submit">
</form>
Assuming the form is defined in index.html stored in a directory named public relative to where your script is located, you can serve it this way:
const http = require("http");
const path = require("path");
const fs = require("fs");
const express = require("express");
const app = express();
const httpServer = http.createServer(app);
const PORT = process.env.PORT || 3000;
httpServer.listen(PORT, () => {
console.log(`Server is listening on port ${PORT}`);
});
// put the HTML file containing your form in a directory named "public" (relative to where this script is located)
app.get("/", express.static(path.join(__dirname, "./public")));
Once that's done, users will be able to upload files to your server via that form. But to reassemble the uploaded file in your application, you'll need to parse the request body (as multipart form data).
In Express 3.x you could use the express.bodyParser middleware to handle multipart forms, but as of Express 4.x no multipart body parser is bundled with the framework. Luckily, you can choose from one of the many available multipart/form-data parsers out there. Here, I'll be using multer:
You need to define a route to handle form posts:
const multer = require("multer");
const handleError = (err, res) => {
res
.status(500)
.contentType("text/plain")
.end("Oops! Something went wrong!");
};
const upload = multer({
dest: "/path/to/temporary/directory/to/store/uploaded/files"
// you might also want to set some limits: https://github.com/expressjs/multer#limits
});
app.post(
"/upload",
upload.single("file" /* name attribute of <file> element in your form */),
(req, res) => {
const tempPath = req.file.path;
const targetPath = path.join(__dirname, "./uploads/image.png");
if (path.extname(req.file.originalname).toLowerCase() === ".png") {
fs.rename(tempPath, targetPath, err => {
if (err) return handleError(err, res);
res
.status(200)
.contentType("text/plain")
.end("File uploaded!");
});
} else {
fs.unlink(tempPath, err => {
if (err) return handleError(err, res);
res
.status(403)
.contentType("text/plain")
.end("Only .png files are allowed!");
});
}
}
);
In the example above, .png files posted to /upload will be saved to the uploads directory relative to where the script is located.
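Note that fs.rename will fail if the uploads directory doesn't exist yet, so you may want to create it once at startup (a small addition, assuming the same layout as above):
// make sure the uploads directory exists before accepting uploads
fs.mkdirSync(path.join(__dirname, "uploads"), { recursive: true });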
In order to show the uploaded image, assuming you already have an HTML page containing an img element:
<img src="/image.png" />
you can define another route in your express app and use res.sendFile to serve the stored image:
app.get("/image.png", (req, res) => {
res.sendFile(path.join(__dirname, "./uploads/image.png"));
});
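Alternatively, if you want to serve everything stored in the uploads directory rather than a single hard-coded file, you could mount the directory with express.static (a small variation on the route above; the /uploads URL prefix is just an example):
// serve every file in ./uploads under the /uploads URL prefix
app.use("/uploads", express.static(path.join(__dirname, "./uploads")));
The img element would then point at /uploads/image.png instead of /image.png.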
Related
I'm attempting to upload image files to my Next.js app, which I'll eventually store in GCS, but I'm having some trouble with the image form data. I'm using FilePond on the client to handle uploading the file and sending a request to a simple API that I have on the server.
// Component
import { useState } from "react";
import { FilePond, File, registerPlugin } from "react-filepond";
import FilePondPluginImageExifOrientation from 'filepond-plugin-image-exif-orientation';
import FilePondPluginImagePreview from "filepond-plugin-image-preview";
registerPlugin(FilePondPluginImageExifOrientation, FilePondPluginImagePreview);
const Page = () => {
const [productImages, setProductImages] = useState<File[]>([]);
return (
<FilePond
allowMultiple={true}
maxFiles={2}
files={productImages}
onupdatefiles={setProductImages}
server={{
process: {
url: "/api/upload",
method: "POST",
headers: {
"Content-Type": "mutlipart/form-data"
},
ondata: formData => {
formData.append('image', "test-image");
return formData;
}
}
}}
/>
);
};
export default Page;
// ./pages/api/upload
import { NextApiRequest, NextApiResponse } from "next";
const Index = (_req: NextApiRequest, res: NextApiResponse) => {
const reqBody = _req.body ?? null;
console.log(_req);
if (!reqBody) return res.status(200).json({ message: "No request body found" });
res.status(200).json({ data: "OK" });
};
export default Index;
The issue I'm seeing is that the files are being sent as a giant blob string, and I've seen other people able to access the files property from the incoming request (shown here). This is my first time building a file-upload feature into any of my projects, so I'm not entirely sure what the best practice is for handling files from incoming requests and parsing them to be stored in a file storage service like GCS or S3.
You might need to chunk the image file: set the chunkUploads configuration option to true.
Then your backend should process the chunked upload, roughly as in the sketch below.
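This is only an outline, not a drop-in implementation. It assumes the FilePond component is also given chunkUploads={true} and a server option whose patch entry is something like "?patch=" (so the transfer id arrives as a query parameter), and it writes chunks to a local tmp-uploads directory before you push the finished file to GCS:
// ./pages/api/upload.js -- rough sketch of a chunked-upload endpoint for FilePond
import fs from "fs";
import path from "path";
import crypto from "crypto";

// FilePond sends raw binary PATCH bodies, so Next.js must not parse them.
export const config = { api: { bodyParser: false } };

const UPLOAD_DIR = path.join(process.cwd(), "tmp-uploads"); // assumed temp location

const readRawBody = req =>
  new Promise((resolve, reject) => {
    const chunks = [];
    req.on("data", c => chunks.push(c));
    req.on("end", () => resolve(Buffer.concat(chunks)));
    req.on("error", reject);
  });

export default async function handler(req, res) {
  fs.mkdirSync(UPLOAD_DIR, { recursive: true });

  if (req.method === "POST") {
    // Step 1: FilePond asks for a unique transfer id before sending any chunks.
    const transferId = crypto.randomBytes(16).toString("hex");
    fs.writeFileSync(path.join(UPLOAD_DIR, transferId), Buffer.alloc(0));
    res.setHeader("Content-Type", "text/plain");
    return res.status(200).send(transferId);
  }

  if (req.method === "PATCH") {
    // Step 2: each chunk arrives as a PATCH with offset/length/name headers.
    const transferId = String(req.query.patch);
    const offset = Number(req.headers["upload-offset"]);
    const totalLength = Number(req.headers["upload-length"]);
    const chunk = await readRawBody(req);
    const filePath = path.join(UPLOAD_DIR, transferId);

    const fd = fs.openSync(filePath, "r+");
    fs.writeSync(fd, chunk, 0, chunk.length, offset);
    fs.closeSync(fd);

    if (offset + chunk.length >= totalLength) {
      // Last chunk received: the complete file now sits at filePath and can be
      // renamed to req.headers["upload-name"] or streamed to GCS here.
    }
    return res.status(204).end();
  }

  if (req.method === "HEAD") {
    // Step 3 (retry/resume): report how many bytes we already have.
    const filePath = path.join(UPLOAD_DIR, String(req.query.patch));
    const offset = fs.existsSync(filePath) ? fs.statSync(filePath).size : 0;
    res.setHeader("Upload-Offset", offset);
    return res.status(200).end();
  }

  return res.status(405).end();
}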
In my NuxtJS application I have a folder with HTML pages that can be added/deleted at any time from outside (/static/pages/page1.html, /static/pages/page2.html, ...), and I have a mapping from real URIs to these pages:
{ '/foo': 'page1.html', '/bar': 'page2.html', ... }
I know I can use #nuxtjs/proxy, but it requires rebuilding the app every time the mapping changes. I also know I can use nginx rewrites for this, but changing its config every time is painful too.
I also tried using a 'pages/_.vue' file, reading the .html in the component and placing its content into the page with v-html, but the files contain full HTML pages (with scripts), and Nuxt throws an error in this case, because v-html doesn't allow using JS (or maybe for other reasons I don't understand).
How can I make a dynamic proxy for this in NuxtJS?
For anyone looking for an answer to the same question:
I solved this by creating a simple server middleware.
In /pages_proxy/index.js:
const path = require('path');
const express = require('express');

const app = express();
const router = express.Router();
router.get('*', async (req, res, next) => {
const pages = { '/foo/': 'page1.html', '/bar/': 'page2.html', ... }
const page = pages[req.path];
if (page) {
res.sendFile(path.join(__dirname, '../static/pages', page));
} else {
next();
}
});
app.use(router)
module.exports = app
In nuxt.config.js:
serverMiddleware: {
'/': '~/pages_proxy'
},
I have a website running VueJS at localhost:3000 which does some stuff to call this.nextImage().
methods:
// content //
async nextImage() {
console.log("In nextImage from App.vue"); // keeping track of location
try {
const response = await axios.get('http://localhost:5050/images');
console.log(response.data);
[how to make an image?]
} catch (error) {
console.error(error);
}
}
// content //
<template>
<!-- stuff -->
<div class="picture"><img :src="[what should go here?]" :alt="imageName"></div>
<!-- more stuff -->
</template>
On localhost:5050 there is an Express server, which includes this:
const path = require('path')
// content //
app.get('/images', (req, res) => {
console.log("Express server: /images"); // tracking location
let imageName = 'myImage'
let imagePath = path.join(__dirname, '/images/' + imageName + '.jpeg')
res.sendFile(imagePath)
})
Logging the response.data gives
����JFIF���
!.%+!&8&+/1555$;#;4?.4514+$+44444444444444444444444444444444444444444444444444���"����B !1AQ2aq���"BR�����b�#3CSr���D��$%4����&1Q!Aa�2q�"��?�Z�UyEZL�>��ˀ��#�'G
��YU�U�$RlX�d<ǜ
(... abbreviated because I had too much code)
I need two pretty straightforward things:
The image to render properly
The name of the image
This was a pretty easy fix. I didn't actually need to send the file itself, just a link to the file (just res.send(imagePath)). When the client makes a GET call to the server, they get a URL that can just be included in the img tag like so: <img :src="imagePath">.
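A rough sketch of that approach (the /static-images mount, the images folder, and the imageUrl/imageName data properties are illustrative assumptions rather than code from the question):
// Express side: expose the images folder and return a browser-reachable URL plus the name
app.use('/static-images', express.static(path.join(__dirname, 'images')));

app.get('/images', (req, res) => {
  const imageName = 'myImage';
  res.json({
    name: imageName,
    url: `http://localhost:5050/static-images/${imageName}.jpeg`
  });
});

// Vue side: store the URL and name, then bind them in the template
async nextImage() {
  try {
    const response = await axios.get('http://localhost:5050/images');
    this.imageUrl = response.data.url;
    this.imageName = response.data.name;
  } catch (error) {
    console.error(error);
  }
}

<!-- template -->
<div class="picture"><img :src="imageUrl" :alt="imageName"></div>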
I am using minio to manage the files
const getMinioClient = () => {
const minioClient = new Minio.Client({
endPoint: '127.0.0.1',
port: 9000,
useSSL: false,
accessKey: 'minioadmin',
secretKey: 'minioadmin'
});
return minioClient;
};
uploadFile(bucketName, newFileName, localFileLocation, metadata = {}) {
return new Promise((resolve, reject) => {
const minioClient = getMinioClient();
//'application/octet-stream'
minioClient.fPutObject(bucketName, newFileName, localFileLocation, metadata , (err, etag) => {
if (err) return reject(err);
return resolve(etag);
});
});
}
With the code above I can upload the file; after a successful upload it returns only the etag, but I want to get the download link. How can I get it directly, without looking up the file name again?
You won't be able to get a public URL/link for accessing images unless you manually generate a time-limited download URL, using something like:
https://min.io/docs/minio/linux/reference/minio-mc/mc-share-download.html#generate-a-url-to-download-object-s
One workaround is to let nginx directly access the location you are uploading your files to:
https://gist.github.com/harshavardhana/f05b60fe6f96803743f38bea4b565bbf
After you have successfully written your file with the code above, you can use the presignedUrl method to generate a link to your image.
An example for Javascript is here: https://min.io/docs/minio/linux/developers/javascript/API.html#presignedUrl:~:text=//%20presigned%20url%20for%20%27getObject%27%20method.%0A//%20expires%20in%20a%20day.%0AminioClient.presignedUrl(%27GET%27%2C%20%27mybucket%27%2C%20%27hello.txt%27%2C%2024*60*60%2C%20function(err%2C%20presignedUrl)%20%7B%0A%20%20if%20(err)%20return%20console.log(err)%0A%20%20console.log(presignedUrl)%0A%7D)
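For instance, a minimal sketch reusing the getMinioClient helper from the question (the 24-hour expiry is only an example value):
const getDownloadLink = (bucketName, objectName, expirySeconds = 24 * 60 * 60) => {
  return new Promise((resolve, reject) => {
    const minioClient = getMinioClient();
    // presignedGetObject generates a time-limited GET URL for the stored object
    minioClient.presignedGetObject(bucketName, objectName, expirySeconds, (err, presignedUrl) => {
      if (err) return reject(err);
      return resolve(presignedUrl);
    });
  });
};

// e.g. right after uploadFile(...) resolves:
// const link = await getDownloadLink(bucketName, newFileName);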
In any case you have to set an expiration time: either set a very long one that is suitable for your app, or, if you have a backend, request the images from the frontend through the backend with the getObject method: getObject(bucketName, objectName, getOpts[, callback]).
https://min.io/docs/minio/linux/developers/javascript/API.html#presignedUrl:~:text=getObject(bucketName%2C%20objectName%2C%20getOpts%5B%2C%20callback%5D)
If you only have a small number of static images to show in your app (which are not uploaded by the app), you can also create the links manually with the MinIO client or from the MinIO UI.
I've written a custom field for KeystoneJS's AdminUI which uses TinyMCE's editor.
KeystoneJS runs an Apollo GraphQL Server underneath and auto-generates mutations and queries based on your CMS schema. TinyMCE has the capability to enter custom hooks to upload images.
I'd like to be able to connect the two -- upload images from TinyMCE to KeystoneJS's server using GraphQL mutations.
For instance, in my setup I have an Image field in the CMS. KeystoneJS has a GraphQL mutation that will allow me to upload an image
createImage(data: ImageCreateInput): Image
where ImageCreateInput is
type ImageCreateInput {file: Upload}
This tutorial has an explanation of how to upload images from Apollo Client to an Apollo Server (which KeystoneJS is running).
const UPLOAD_MUTATION = gql`
mutation submit($file: Upload!) {
submitAFile(file: $file) {
filename
mimetype
filesize
}
}
`;
return (
<form>
<Mutation mutation={UPLOAD_MUTATION} update={mutationComplete}>
{mutation => (
<input
type="file"
onChange={e => {
const [file] = e.target.files;
mutation({
variables: {
file
}
});
}}
/>
)}
</Mutation>
</form>
);
I'm a bit confused as to how to integrate this into TinyMCE, particularly since the example is based on using a form, and TinyMCE sends me the data encoded in -- as far as I can see -- Base64.
TinyMCE provides me the opportunity to specify a custom upload handler:
tinymce.init({
selector: 'textarea', // change this value according to your HTML
images_upload_handler: function (blobInfo, success, failure) {
var xhr, formData;
xhr = new XMLHttpRequest();
xhr.withCredentials = false;
xhr.open('POST', 'postAcceptor.php');
xhr.onload = function() {
var json;
if (xhr.status != 200) {
failure('HTTP Error: ' + xhr.status);
return;
}
json = JSON.parse(xhr.responseText);
if (!json || typeof json.location != 'string') {
failure('Invalid JSON: ' + xhr.responseText);
return;
}
success(json.location);
};
formData = new FormData();
formData.append('file', blobInfo.blob(), blobInfo.filename());
xhr.send(formData);
}
});
It seems that TinyMCE provides me with a blob, when, as far as I can see, Apollo Client expects a file name. Do I just use blobInfo.filename? Is there a better way to upload TinyMCE images to a GraphQL Apollo Server?
I've never done any image uploading with TinyMCE before.
I've never used this either, but I would try to use file_picker_callback.
In this demo you can see files[0]; I'm pretty sure you can insert an upload mutation call there, with the file as an argument, and pass the result (URL) to cb() (as in the example).
Also adjust the configuration as in this answer.
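A rough sketch of that idea (only an outline: it assumes an already-configured Apollo Client instance named client whose link can handle Upload variables, e.g. apollo-upload-client's createUploadLink, and a Keystone schema where the created Image exposes its URL as file { publicUrl }; adjust the selection set to whatever your schema actually returns):
import gql from "graphql-tag";

const CREATE_IMAGE = gql`
  mutation createImage($file: Upload!) {
    createImage(data: { file: $file }) {
      id
      file {
        publicUrl   # assumed field name; depends on your Keystone file adapter
      }
    }
  }
`;

tinymce.init({
  selector: "textarea",
  image_title: true,
  file_picker_types: "image",
  file_picker_callback: function (cb, value, meta) {
    // Build a throwaway file input, as in the TinyMCE demo.
    const input = document.createElement("input");
    input.setAttribute("type", "file");
    input.setAttribute("accept", "image/*");

    input.onchange = function () {
      const file = this.files[0];
      // Run the upload mutation with the File object as the Upload variable...
      client
        .mutate({ mutation: CREATE_IMAGE, variables: { file } })
        // ...and hand the resulting URL back to TinyMCE.
        .then(({ data }) => cb(data.createImage.file.publicUrl, { title: file.name }))
        .catch(err => console.error(err));
    };

    input.click();
  }
});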