Best way to store uploaded files in a Spring MVC environment - spring

The question is quite simple: what is the best way to store uploaded files in a clustered Spring MVC environment?
Example: let's say I'm coding a social network and I have the possibility to upload my personal profile picture. The user will then have at most one image associated with his profile.
One solution would be to add a blob column to the users table in the DB; I read this is good in a clustered environment (it's easier to scale the DB than a folder containing lots of images). Is that true? How can I achieve this in a JPA (Hibernate and PostgreSQL) environment? I saw there is the @Lob annotation, but what type of variable should I use? byte[]?
Another solution is to store them on the hard drive. What is the best path to store these images? In the webapp folder? In the classpath? In another folder outside the Spring context?
Thank you very much for your help.
Update: an important detail that I forgot to mention. The administration dashboard (CMS/back end) of this website will be coded in PHP (I know, I know...) while the front end will be coded in Java Spring (MVC). The database is managed entirely by the Java part (JPA). Whatever the final choice is, it has to be compatible with this requirement.
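A side note on the on-disk option raised above: if the images end up in a folder outside the webapp, Spring MVC can still serve them through a resource handler, and a PHP back end could read or write the same directory. A minimal sketch with Java config, assuming a hypothetical /var/www/uploads directory:

```java
import org.springframework.context.annotation.Configuration;
import org.springframework.web.servlet.config.annotation.ResourceHandlerRegistry;
import org.springframework.web.servlet.config.annotation.WebMvcConfigurer;

@Configuration
public class UploadConfig implements WebMvcConfigurer {

    @Override
    public void addResourceHandlers(ResourceHandlerRegistry registry) {
        // Serve files saved under /var/www/uploads (hypothetical path) at /uploads/**,
        // keeping them outside the webapp folder and the classpath.
        registry.addResourceHandler("/uploads/**")
                .addResourceLocations("file:/var/www/uploads/");
    }
}
```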

I'd rather not store them in the DB. The best place is some server for static files (or a CDN).
If you really need to, you can store an image as a Lob, but I think it's a bad idea for performance and scalability reasons.
What is more, database storage tends to be more expensive than a simple Content Delivery Network.
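If the Lob route is chosen anyway, a minimal JPA sketch (assuming Hibernate with PostgreSQL, a users table, and javax.persistence imports; newer stacks use jakarta.persistence) could look roughly like this:

```java
import javax.persistence.Basic;
import javax.persistence.Entity;
import javax.persistence.FetchType;
import javax.persistence.Id;
import javax.persistence.Lob;
import javax.persistence.Table;

@Entity
@Table(name = "users")
public class User {

    @Id
    private Long id;

    // @Lob on byte[] maps to a large object (oid) or bytea column, depending on the
    // Hibernate/PostgreSQL configuration. The LAZY hint only takes effect if Hibernate
    // bytecode enhancement is enabled; otherwise the image loads with every user query.
    @Lob
    @Basic(fetch = FetchType.LAZY)
    private byte[] profilePicture;

    // getters and setters omitted
}
```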

Related

spring boot multiple microservices with one database

I know there are many questions like this and almost all answers are No. And the reason is a single microservice should be independent of another one. And if there is a change in a table, all microservices using that table need to be changed.
But my question is: if my database structure is fixed (there will hardly be any change in the table structure), would it be a good idea to create multiple microservices pointing to the same database?
Okay... here is my project.
We are going to migrate a Struts 1.3/EJB 2.0 project to Angular/microservices. This project has 5 different modules and each module is a huge one. The project has been in production for the past 13 years, so there is very little chance of the table structures changing.
The reason I want to make different microservices is that each module is huge and complicated, and we still get requirements to add or change business logic. In that case, I could deploy only the affected microservice.
Any suggestions please.
I suggest creating a new service that accesses that database, and having all other services communicate with this service instead of directly with the database (see the sketch below).
If you don't want to create a new service, at least access the DB through some database abstraction layer.
For example, in SQL Server use views and stored procedures instead of accessing the tables directly.
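A rough sketch of the first suggestion, assuming Spring Boot, a hypothetical Customer entity and CustomerRepository, and a hypothetical data-service host name: the data service is the only one that talks to the shared database, and every other microservice calls it over HTTP.

```java
import org.springframework.stereotype.Service;
import org.springframework.web.bind.annotation.GetMapping;
import org.springframework.web.bind.annotation.PathVariable;
import org.springframework.web.bind.annotation.RequestMapping;
import org.springframework.web.bind.annotation.RestController;
import org.springframework.web.client.RestTemplate;

// In the dedicated data-access service: the only place with a DataSource and JPA repositories.
@RestController
@RequestMapping("/customers")
class CustomerDataController {

    private final CustomerRepository repository; // hypothetical Spring Data repository over the shared table

    CustomerDataController(CustomerRepository repository) {
        this.repository = repository;
    }

    @GetMapping("/{id}")
    Customer findOne(@PathVariable Long id) {
        return repository.findById(id).orElseThrow();
    }
}

// In any other microservice: no database access at all, just an HTTP call to the data service.
@Service
class CustomerClient {

    private final RestTemplate restTemplate = new RestTemplate();

    Customer findCustomer(Long id) {
        // "data-service" is a hypothetical host name (e.g. resolved by service discovery)
        return restTemplate.getForObject("http://data-service/customers/{id}", Customer.class, id);
    }
}
```

That way, if the table structure ever does change, only the data service has to be adapted, which fits the goal of deploying one module at a time.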

How to make a text file be the "database" in a Spring REST application?

I'm developing a Jokenpo game using React with Spring REST, but I can't have a database to store all the information needed (create and delete moves, create and delete players).
I don't know the best development practice, or whether there is a design pattern for storing that kind of information. I know there is the src/main/resources folder where maybe I could store a text file; I thought the API could load that file on startup, at the beginning of the game, and then update it during the game.
Trying to be more clear: I would just like to know the simplest way of storing information, without a database, inside a Spring REST application. I really appreciate any help. Thanks.
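On the text-file idea itself: one caveat is that src/main/resources gets packaged inside the jar and isn't reliably writable at runtime, so a plain file outside the classpath is a safer target. A minimal sketch, assuming Jackson is on the classpath (Spring Boot's web starter already brings it in) and a hypothetical Player class:

```java
import com.fasterxml.jackson.core.type.TypeReference;
import com.fasterxml.jackson.databind.ObjectMapper;

import java.io.File;
import java.io.IOException;
import java.util.ArrayList;
import java.util.List;

public class PlayerFileStore {

    private final ObjectMapper mapper = new ObjectMapper();
    private final File file = new File("data/players.json"); // writable path outside the classpath

    public List<Player> load() throws IOException {
        if (!file.exists()) {
            return new ArrayList<>();          // first run: start with an empty game state
        }
        return mapper.readValue(file, new TypeReference<List<Player>>() {});
    }

    public void save(List<Player> players) throws IOException {
        file.getParentFile().mkdirs();         // make sure the data/ directory exists
        mapper.writeValue(file, players);      // rewrite the whole file on every change
    }
}
```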
Take a look at SQLite. It's a very light database library that you can include as a dependency of your Spring application. It doesn't require a separate database server to run, and the entire database is stored in a single file, whose location you can choose in the connection string.
It offers the flexibility of a standard database, so you can use Spring Data / JPA to access the data. It has some limitations compared with more robust databases like MySQL, especially related to concurrent writes, that you should investigate and be aware of. Usually it works very well for small applications or embedded applications.
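For example, a minimal sketch of pointing Spring at a SQLite file, assuming the org.xerial sqlite-jdbc driver is on the classpath (the file path in the JDBC URL is made up):

```java
import javax.sql.DataSource;

import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.jdbc.datasource.DriverManagerDataSource;

@Configuration
public class SqliteConfig {

    @Bean
    public DataSource dataSource() {
        DriverManagerDataSource ds = new DriverManagerDataSource();
        ds.setDriverClassName("org.sqlite.JDBC");   // driver class from the sqlite-jdbc dependency
        ds.setUrl("jdbc:sqlite:data/jokenpo.db");   // the whole database lives in this one file
        return ds;
    }
}
```

If Spring Data JPA is layered on top, a SQLite Hibernate dialect is also needed (recent Hibernate versions ship one in the hibernate-community-dialects artifact).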

When using ASP.NET Core MVC + EF Core + the ability to encrypt files, what are the differences between Blob, FileStream & File System for managing files?

I am working on an ASP.NET Core MVC web application; the web application is a document management workflow where, inside each of the workflow steps, users can upload documents, as follows:
Users can upload documents with the following restrictions: a file cannot exceed 5 MB, and all the documents inside a workflow cannot exceed 50 MB unless an admin approves it. They can upload as many documents as they want.
We will have a lot of views which will show a step and all the documents attached to it, and users can choose to download the documents.
We can have an unlimited number of workflows; the more users register with our application, the more workflows will be created.
Certain files can be marked as confidential, so they should be encrypted when stored, either inside the database or inside the file system.
We are planning to use EF Core as the data access layer for our web application, plus SQL Server 2016 or 2017.
Now my question is how we should manage our files. I found these 3 approaches:
Blob.
FileStream
File system.
Now the first approach will allow us to encrypt the files inside the database and will work with EF, but it has a huge drawback on performance, since opening a file or querying files from the database means they will be loaded into the hosting server's memory. Since we are seeking an extensible approach, I think this approach will not work for us, as it is less scalable.
The second approach will have better performance compared to the first (Blob), but FileStream is not supported by EF and does not allow encryption, so we have to exclude this as well.
The third approach, storing the files inside a folder named after the workflow ID and storing the link to the file/folder inside the DB, will allow us to encrypt the files and will work with EF, and it has better performance compared to Blob (not sure if this is valid for FileStream). The only drawback is that we cannot achieve atomicity between the files and their related database records, but with some extra code we can handle this ourselves. For example, deleting a database record will delete all its documents inside the folder, and we can add some background jobs to make sure all the documents have database records, and otherwise delete the documents.
So based on the above I found that the third approach is the best fit for our needs. Can anyone advise on this, please? Are my assumptions correct? And is there a fourth approach, or a hybrid approach, that could be a better fit for us?
Although modern RDBMSs have been optimised for data storage, with the perks of integrity and atomicity, databases should be considered the last resort for file storage (Stack Overflow posts like this and this corroborate the above), and therefore the third option mentioned, or an improvement thereof, gets my vote.
For instance, a potential improvement would be to store the files renamed to a hash of their content and record the hash in the database, which eliminates OS restrictions on subdirectories/files, filenames and paths. Moreover, with a well-structured directory layout, duplicates could be filtered out.
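A minimal sketch of that hash-naming idea, shown in Java purely to illustrate the technique (the original context is .NET, and the root directory here is hypothetical): compute a digest of the content, use the hex digest as the file name, and store the digest in the database row.

```java
import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Path;
import java.nio.file.StandardCopyOption;
import java.security.MessageDigest;
import java.security.NoSuchAlgorithmException;
import java.util.HexFormat;

public class HashedFileStore {

    private final Path root;

    public HashedFileStore(Path root) {
        this.root = root;
    }

    /** Copies the file under a name derived from its content and returns the hash to store in the DB. */
    public String store(Path uploadedFile) throws IOException, NoSuchAlgorithmException {
        MessageDigest digest = MessageDigest.getInstance("SHA-256");
        String hash = HexFormat.of().formatHex(digest.digest(Files.readAllBytes(uploadedFile)));

        // Fan out into subdirectories by the first two hex characters to keep folders small.
        Path target = root.resolve(hash.substring(0, 2)).resolve(hash);
        Files.createDirectories(target.getParent());
        Files.copy(uploadedFile, target, StandardCopyOption.REPLACE_EXISTING);

        return hash; // identical content always maps to the same path, so duplicates collapse
    }
}
```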
User-defined database functions can aid in achieving atomicity, which would remove the need for background jobs. An excellent guide on UDFs, particularly for accessing the filesystem and invoking an executable, can be found here.

In Windows Azure, what file should I put storage information in for ASP.NET MVC?

I'm toying with Windows Azure creating an ASP.NET MVC project, I've checked the WAPTK (Windows Azure Platform Training Kit), Google, and here for answers to my question, but I couldn't find the answer. In an ASP.NET MVC project, in what file do I create containers for cloud storage? (In the WAPTK, there's a hands-on lab that uses webforms and puts storage containers in the _Default partial class.)
Generally I'd recommend you set up the container access (including the create) in some nicely encapsulated class - preferably hiding behind an interface for easy testability.
I'd recommend:
put this class and its interface in a class library (not in your ASP.NET project)
put the config details in the csdef/cscfg files
if you only plan to use a fixed list of containers, then either:
create these ahead of installing your app - e.g. from a simple command line app
or create these from a call in the init of Global.asax
if you plan to dynamically create containers (e.g. for different users or actions) then create these from Controller/Service code as is required - e.g. when a user signs up or when an action is first performed.
if actions might occur several times and you really don't know if the container will be there or not, then find some way (e.g. an in-memory hashtable or in-sql persistent table) to help ensure that you don't need to continually call CreateIfNotExist - remember that each call to CreateIfNotExist will slow your app down and cost you money.
for "normal" access operations like read/write/delete, these will typically be from Controller code - or from Service code sitting behind a Controller
If in doubt, think of it a bit like "how would I partition up my logic if I was creating folders on a local disk - or on a shared network drive"
Hope that helps a bit.
Stuart
I am not sure if I understand you correctly, but generally speaking files and Azure don't fit together well. All changes stored on the local file system are volatile and only guaranteed to live as long as the current Azure instance. You can however create a blob and mount it as a local drive, which makes the data persistent. This approach has some limitations, since it allows one Azure instance to write and a maximum of 8 readers.
So instead you should probably use blobs rather than files. The problem of knowing which blob to access would then be solved by using Azure table storage to index the blobs.
I recently went to a presentation where the presenter had investigated quite a bit about Azure table storage, and his finding was that limiting partition keys to groups of 100-1000 elements gives the best performance. (Partition keys are used internally by Azure to determine which data to group.)
You should definitely use blob storage for your files. It's not particularly difficult and as this is a new project there is no need to use Azure Drives.
If you are just using blob storage to serve up images for your site then you can reference them with a normal img tag in your HTML, e.g.
<img src="http://myaccountname.blob.core.windows.net/containername/filename">
This will only work if the file or container is shared and not secured. This is fine if you are just serving up static content on html pages.
If you want to have secured access to blobs for a secure site then you have to do a couple of things. Firstly, your website will need to know how to access your blobs.
In your ServiceDefinition.csdef file you will need to include
<Setting name="StorageAccount" />
and then add
<Setting name="StorageAccount" value="DefaultEndpointsProtocol=https;
AccountName=[youraccountname];AccountKey=[youraccountkey]" />
to your ServiceConfiguration.cscfg file.
Then you can use the Windows Azure SDK to access that account from within your web role, starting with:
Dim Account As CloudStorageAccount = CloudStorageAccount.Parse(RoleEnvironment.GetConfigurationSettingValue("StorageAccount"))
Dim blobClient As CloudBlobClient = Account.CreateCloudBlobClient
From there you can:
read/write blobs
delete blobs
list blobs
create time-limited URLs using shared access signatures
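The answer above uses the .NET SDK from within the web role. Purely as an illustration of the same flow (parse the connection string, get the container, create it if needed, upload a blob), here is a rough sketch using the current azure-storage-blob SDK for Java; the container and file names are the placeholders from the answer, and this is an assumption on my part rather than the API the original answer relies on.

```java
import com.azure.storage.blob.BlobContainerClient;
import com.azure.storage.blob.BlobServiceClient;
import com.azure.storage.blob.BlobServiceClientBuilder;

public class BlobUploadSketch {
    public static void main(String[] args) {
        // Same "DefaultEndpointsProtocol=...;AccountName=...;AccountKey=..." connection string
        // format as in the configuration setting shown above.
        String connectionString = System.getenv("STORAGE_ACCOUNT");

        BlobServiceClient service = new BlobServiceClientBuilder()
                .connectionString(connectionString)
                .buildClient();

        BlobContainerClient container = service.getBlobContainerClient("containername");
        container.createIfNotExists();   // idempotent container creation

        // Upload a local file; 'true' overwrites an existing blob with the same name.
        container.getBlobClient("filename").uploadFromFile("local.png", true);
    }
}
```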
There is a great resource here from Steve Marx which, although it is about accessing blob storage from Silverlight (which you are not using), does give you lots of information in one place.
Your question wasn't very specific but this should give you some idea where to start.
@Faester is correct: you will probably need some resource, either Table Storage or SQL Azure, to store references to these files.

What's the best way of persisting data to Isolated Storage on Windows Phone 7?

I want to persist objects into Isolated Storage, so far I could think of these ways:
Serialize them into an XML file when saving and then deserialize them back when loading.
Use an object DB. Doubt abounds about a good or recommended one (examples are Perst, winphone7db and Sterling DB).
Anyone can suggest some best practices?
As a basic guideline:
If you need the functionality of a database (relations, transactions, search, etc.) then you should use a database.
If you just need an object store, then you should just save your objects into Isolated Storage directly (serialising where necessary).
I haven't used each of the different DB options available, but I would probably go with Perst as it's the most established (there's also a good guide here); winphone7db is also not available yet.
