Design Project Insight and Ideas [closed] - project-management

Closed. This question needs to be more focused. It is not currently accepting answers.
Closed 3 years ago.
First, a note: I'm not asking for code so much as insight into the process flow.
I have a client who is a photographer; he takes pictures of his clients.
He wants to be able to upload them to his site via a web interface.
He also wants a directory for each client,
i.e. www.photographer.com/clientname
The pictures will be retouched and resized JPGs.
What is the best method for storing those images? MySQL?
What method should I use to let my client create new directories on the web server?
From there I would essentially list the JPGs out by file name, and hyperlink each name to the actual image that the client would click on.
Thoughts and ideas are appreciated.

Does your client want actual directories on the server? Or does he simply want a directory-like URL? If it's the latter, I would use a CMS or mod_rewrite to abstract those URLs. You can have your own system of directories for image storage, and store only their filenames/paths in the database.
A CMS would also make your life and his life a lot easier. I know the more popular ones (Drupal, Joomla, WordPress) have various image gallery/storage systems you can add in, some allowing multiple simultaneous file uploads, which is not the easiest thing to implement from scratch.
Unless your client wants to upload everything via FTP, it's probably best not to ask his opinion about the directories, because he probably won't know the difference. I think your best bet is to implement the URL system he wants, along with an easy system for him to add/remove images. A well-implemented CMS would make it even easier and offer far more features than FTP, because that's what they were designed for. It would also be less error-prone, because it will have safeguards against deleting directories, etc.
...Or if you want to reinvent the wheel, you could write your own CMS. Either way, it's best to abstract the storage process and let the client access the site through the controls you set up, which will also let him attach more meaning and functionality to the information he's storing.
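For the mod_rewrite option mentioned above, the URL abstraction could look something like this. This is only a sketch, assuming an Apache host and a hypothetical front-controller script (index.php) that looks the client name up in the database:

```apache
RewriteEngine On
# Serve /clientname through a front controller instead of a real directory.
# Skip the rewrite for files and directories that actually exist on disk.
RewriteCond %{REQUEST_FILENAME} !-f
RewriteCond %{REQUEST_FILENAME} !-d
RewriteRule ^([A-Za-z0-9_-]+)/?$ index.php?client=$1 [L,QSA]
```

With something like this in place, www.photographer.com/clientname stays stable while the actual image files can live wherever your storage scheme puts them.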

Here are some recommendations:
- Store the image files on disk, and assign each file a unique ID. These unique IDs can be mapped to each client in a database.
- To create new directories, either create them manually or have someone with admin access create them; restricting this to an admin makes the site more secure. The admin can create a new directory for a client and then enable the settings for that client to access it.
- User-based access control would let each client view only those files that are specific to him/her.
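The disk-plus-database recommendation above can be sketched in a few lines of Python. This is only an illustration; the storage path and the in-memory dict (standing in for a real database table) are hypothetical names:

```python
import shutil
import uuid
from pathlib import Path

STORAGE = Path("image_store")   # flat on-disk storage, ideally outside the web root
client_images = {}              # stand-in for a DB table: client -> [(image_id, name)]

def store_image(client: str, src: str) -> str:
    """Copy an uploaded image to disk under a unique ID; record it for the client."""
    image_id = uuid.uuid4().hex
    STORAGE.mkdir(exist_ok=True)
    shutil.copyfile(src, STORAGE / f"{image_id}.jpg")
    client_images.setdefault(client, []).append((image_id, Path(src).name))
    return image_id

def gallery_links(client: str) -> list:
    """File-name hyperlinks for a client page, e.g. served at /clientname."""
    return [f'<a href="/image/{image_id}">{name}</a>'
            for image_id, name in client_images.get(client, [])]
```

This also covers the listing step from the question: the page simply lists the stored file names for one client and links each to a URL that serves the image by its ID.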

Related

Is it better to share pages with normal user and admin?

I am new to web development, but I have some experience in VB.Net desktop app programming. The software I am currently maintaining is only accessible within the company office network. Now that management has demanded that the software be accessible online, I have no choice but to convert it to a web-based system.
I have already started developing the software using the Laravel framework, and so far so good. I successfully created the login form and a few pages with CRUD implementation.
However, I have come to a confusing part of the development stage where I want to prevent some types of users from editing/adding/deleting records. Basically, there are two types of users: Administrator and Normal User.
I have already made many sections and pages, and I'm thinking of creating a duplicate version of these pages and folders for normal users only. That way I can remove the edit/delete/etc. controls from the pages where the user doesn't need them. But I'm still in a dilemma, because if I create duplicates of these pages, it will be tiresome to make changes when maintaining them in the future.
So, what do you guys think? Should I make a modified copy of the pages, or just use the same pages for all users and disable some features based on user type?
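For the second option (one set of pages, with features disabled by user type), the core idea is a role-to-permission lookup that the pages consult before rendering edit/delete controls. Here is a minimal sketch in Python rather than Laravel code (Laravel's gates and policies express the same idea); the role and action names are illustrative:

```python
# Role-based feature gating: one set of pages, controls shown per role.
PERMISSIONS = {
    "admin": {"view", "add", "edit", "delete"},
    "user": {"view"},
}

def can(role: str, action: str) -> bool:
    """Return True if the given role may perform the action."""
    return action in PERMISSIONS.get(role, set())

# A template then wraps each control in a check like `if can(role, "delete")`,
# so there is only one copy of every page to maintain.
```

The advantage over duplicated pages is exactly the maintenance concern raised above: adding a feature or fixing a bug happens in one place, and a new role is just a new entry in the permission table.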

What are the possible ways of storing web content (images, videos, PDFs, etc.)? [closed]

Closed. This question needs to be more focused. It is not currently accepting answers.
Closed 4 years ago.
We are planning to host a web app which uses multiple resources like banner images, videos, and PDFs, and these need to change from time to time. If we package those resources in the app, the app size will increase, and on every change we will need to repackage and redeploy.
So we have planned to use AWS S3 and a CloudFront CDN to serve all static web content, and we can then reference those assets in the application.
Please suggest the pros and cons of our architecture and other possible ways of achieving this.
Yes. AWS S3 is indeed a very good choice for hosting your static assets.
As stated by AWS itself:
"S3 is a highly durable, highly available, and inexpensive object storage service that can serve stored objects directly via HTTP. This makes it wonderfully useful for serving static web content directly to web browsers for sites on the Internet."
What does your "app" do? Is it just to display static content? Or does it have a solid backend?
Since it is unclear, here is a wonderful resource from the official AWS site to get you started:
https://aws.amazon.com/getting-started/projects/build-modern-app-fargate-lambda-dynamodb-python/
They have clearly explained how to host your static content and structure your web app.
The pros of using AWS S3 are that it's really cheap and easy to use and configure.
The con is that even if you are just hosting static content, you will be charged for it. Why not use GitHub Pages? It's entirely free!
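To make the S3 + CloudFront suggestion concrete, here is a minimal Python sketch. The bucket name, CloudFront domain, and object keys are hypothetical, and the upload helper assumes boto3 and AWS credentials are available:

```python
def cdn_url(cloudfront_domain: str, key: str) -> str:
    """Build the public CDN URL for an asset stored in S3.

    Assumes the CloudFront distribution uses the S3 bucket as its origin,
    so object keys map directly to URL paths.
    """
    return f"https://{cloudfront_domain}/{key.lstrip('/')}"

def upload_asset(bucket: str, local_path: str, key: str) -> None:
    """Upload one static asset to S3 (requires boto3 and AWS credentials)."""
    import boto3  # imported here so the sketch loads without boto3 installed
    s3 = boto3.client("s3")
    # Long cache lifetime lets CloudFront and browsers cache aggressively;
    # a changed asset should be uploaded under a new, versioned key instead.
    s3.upload_file(local_path, bucket, key,
                   ExtraArgs={"CacheControl": "max-age=31536000"})

# Example (hypothetical names):
# upload_asset("my-app-assets", "banner.jpg", "img/banner-v2.jpg")
# url = cdn_url("d1234abcd.cloudfront.net", "img/banner-v2.jpg")
```

This is the piece that removes the repackage-and-redeploy problem: swapping a banner becomes an upload plus a URL change, with no application deployment.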
You could use Azure Blob Storage. You can store any file format, and it can be secured with a security token for restricted access. It scales without limit and is considered a best practice for large-volume web traffic. Hope it helps.

Google Drive, OneDrive, or Box: which is the fastest API method for displaying complete folder contents? [closed]

Closed. This question is opinion-based. It is not currently accepting answers.
Closed 6 years ago.
We are working on a web application for photo selection/proofing. To manage the photo albums, we want to use a file-sharing service like Dropbox, Google Drive, OneDrive, etc.
One of our use cases involves displaying albums with a large quantity of photos inside our web application. We are currently using Dropbox to fetch complete folders (albums), but since the Dropbox API doesn't offer batch requests, it takes A LOT of time to generate individual share links and display them inside our application (on the order of hours).
I am personally checking other APIs too, but I would really appreciate a little guidance from people with experience of other APIs like Google Drive, OneDrive, Box.com, etc.
What will be the best method/API to achieve this use case efficiently?
I personally think that if we had batch operations available, we would be able to fetch an album of 3000 photos (120 KB average size) in less than 5 minutes on a normal internet connection.
All the services you mention (Dropbox, OneDrive, Google Drive) offer a thumbnail service. My advice is to use that functionality first.
So (supported by all services):
- Request metadata about all files in the folder (one call), including the thumbnail URL and ID. Normally this request is paged; for 3000 images you need only three requests (if the page size is 1000, the maximum for the Google Drive API).
- Display the thumbnails (using the cached URLs).
- When the user selects a thumbnail, retrieve the actual image based on the cached ID, which is one more call.
We integrated with all three services (and S3), and in my experience the performance is more than enough for this kind of scenario.
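The paged-metadata flow described above can be sketched generically. `list_page` below is a stand-in for whichever service's listing call you use (e.g. paging in the Google Drive API); the function name and page-token shape are illustrative, not a real client library:

```python
def fetch_all_metadata(list_page):
    """Collect metadata for every file in a folder by following page tokens.

    `list_page(token)` stands in for the service's paged listing call; it
    must return a dict like {"files": [...], "next_token": str or None}.
    """
    items, token = [], None
    while True:
        page = list_page(token)
        items.extend(page["files"])   # each entry carries id, name, thumbnail URL
        token = page.get("next_token")
        if token is None:             # no more pages
            return items
```

With a page size of 1000, a 3000-photo album costs only three listing calls; thumbnails are then displayed from the cached URLs, and the full-size image is fetched only when a user clicks one.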

When not to use AJAX client-side routing? [closed]

Closed. This question is opinion-based. It is not currently accepting answers.
Closed 8 years ago.
Using client-side routing (with Angular's $routeProvider for example), it is possible to load a whole web app at once and not interact with the server anymore.
However, this might imply a long load time when visitors first open the web app, assuming it has a lot of views.
What are the best practices in terms of client-side routing vs. initial load time?
Well, the default behaviour of Angular is to have the whole app front-loaded, but it depends on what your app does and how big it is. If it's a small app you could do that, especially if you minify everything. If your app is huge (unlikely, given that all the app will do is show some static data), then it's not really a good idea to load everything up front: what if you have 10 MB of scripts and resources? You're putting a lot of strain on your server and eating your customers' bandwidth. For large apps you can dynamically load scripts as routes change; we do something similar to this in a pretty huge Angular app.
The best practice is to fetch only the files that are required to generate the content you want to show the user on that specific route, which goes against what you want to do. As for "not interacting with the server anymore": if your app relies on a server to fetch data or do authentication, you can't really stop interacting with the server.

What service do you use to distribute software? [closed]

Closed. This question does not meet Stack Overflow guidelines. It is not currently accepting answers.
We don’t allow questions seeking recommendations for books, tools, software libraries, and more.
Closed 4 years ago.
I work for a medium-sized software company and have been tasked with finding a new way of electronically distributing our software. We don't have a super-fast connection to distribute it ourselves, so it would need to be a solution that we can upload to and then send out links to customers. The customers won't be purchasing our software from our website, as we already do most of our sales through direct sales and partner sales.
Since I joined the company, we have grown from CD-sized distribution downloads to DVD-sized distribution downloads. We released a new version and find the YouSendIt service to be clunky; 99% of our customers receive a link to download the software, and we only send out printed media if requested.
Is there a service besides YouSendIt that allows unlimited file-size uploads/downloads? I have heard of drop.io, and it seemed to be similar to YouSendIt. If you could point me in the direction of a third-party-hosted electronic software distribution system, it would be appreciated.
Thanks
Mike
You should look into Content Delivery Networks, such as Amazon CloudFront.
You might want to reconsider the way you are going about this.
If your software is open source, you should be using SourceForge. Otherwise, you should just get a cheap hosting plan with lots of transfer bandwidth.
For example, GoDaddy has an unlimited account (unlimited transfer, unlimited space) for about $14.95 per month.
You then point a subdomain, i.e. download.rivageek.com, at that server. This gives your users confidence when they download your application.
If they have to go to some ad-laden 3rd-party site, they might think twice about giving you money. If you lose only one customer to that, it pays for itself (assuming you charge more than $14.95 for your product).
The fine print on many of those 3rd-party sites means they own whatever you upload as well.
If you'd like something that allows (simplistically) secure one-time downloads, I've used filehosting.org in the past. They give you a hashed link to the software when you upload it, which you can then email to anybody you want to be able to download the file. If you want, you can set it to delete the file after one download.
In response to using your own domain for the downloads: it's possible to configure both Amazon S3 and CloudFront to use a custom domain name. Here are the instructions for S3, which are very straightforward:
http://docs.amazonwebservices.com/AmazonS3/latest/index.html?VirtualHosting.html
If emailing out a direct link to your distribution file (zip, etc.) is sufficient, I'd say go with one of these services -- they're very cost effective, reliable, and easy to set up.
You could use a file-hosting service or get a regular web host with unlimited bandwidth; just avoid GoDaddy, as its shared hosting is overcrowded and overbooked (personal experience).
