Store serviceAccountKey.json file in third party server - firebase-admin

I have an Android app which gets its data from a Firebase Realtime Database. To update the database automatically, I've written a Python script which crawls data from a website, processes it, and then sends it to my Realtime Database using the Admin SDK. I want to store and execute the script on my own server so that it runs automatically twice a day. Is it safe to upload my serviceAccountKey.json along with it? If not, how can I achieve my desired functionality?

Yes, it is fine to store the service account JSON file on your own server. That's the intended use case. Just make sure it's not exposed to users in any way.
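For reference, a minimal sketch of how such a server-side script typically initializes the Admin SDK with the key file; the file path, database URL, and node name below are placeholders, not anything from your project:

```python
import firebase_admin
from firebase_admin import credentials, db

# The key file lives on the server only; it is never shipped inside the app.
cred = credentials.Certificate("/path/on/server/serviceAccountKey.json")

# Placeholder URL -- substitute your own Realtime Database URL.
firebase_admin.initialize_app(cred, {
    "databaseURL": "https://your-project-id-default-rtdb.firebaseio.com"
})

def push_crawled_data(items):
    """Write the crawled and processed items under a top-level node."""
    db.reference("crawled_data").set(items)

if __name__ == "__main__":
    push_crawled_data({"example": "data"})
```

A cron entry such as `0 6,18 * * * /usr/bin/python3 /path/to/update_db.py` would then run it twice a day.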

Related

Web API Application consumes endpoint, which needs periodic password changes - how to automate?

Our web application, WebAPI2, consumes an external endpoint. The external endpoint requires periodic password changes, and it also exposes a passwordChange endpoint, so we would like to automate the rotation.
Rather than leaving this as a manual job, what would be the best way to accomplish it periodically from our own Web API?
So the 2 things I wonder about are:
How can I trigger it? (The requirement is every 90 days, but there's no reason why we couldn't do it more often.)
Where is the new password best stored? Config, database, or somewhere else?
Thanks.
You could create a scheduled task on the server that runs every 90 days, generates a new password, encrypts it, and saves it to your database.
The WebApi then just needs a way of pulling the password from the database, decrypting it, and using it when calling the external endpoint.
I wouldn't be modifying the web.config in this manner, nor would I ever store passwords in a text file.
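The rotation job itself is small. A rough sketch of the idea, shown in Python for brevity; the endpoint URL, table name, and key handling are all assumptions, and in a WebAPI2 setup you would do the equivalent in C# from the scheduled task:

```python
import secrets
import sqlite3
import requests
from cryptography.fernet import Fernet

# The encryption key must itself live outside the database (e.g. a machine key store).
fernet = Fernet(open("secret.key", "rb").read())

def rotate_password(current_password):
    new_password = secrets.token_urlsafe(24)

    # Hypothetical external passwordChange endpoint.
    requests.post(
        "https://external-service.example.com/passwordChange",
        json={"oldPassword": current_password, "newPassword": new_password},
        timeout=30,
    ).raise_for_status()

    # Store only the encrypted form; the API decrypts it when calling the endpoint.
    encrypted = fernet.encrypt(new_password.encode())
    with sqlite3.connect("credentials.db") as conn:
        conn.execute(
            "UPDATE api_credentials SET password = ? WHERE name = 'external'",
            (encrypted,),
        )
```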

Sending csv file via FTP to PythonAnywhere

My organisation uses Business Objects as a layer over its Oracle database so that people like me (i.e. not in the IT dept) can access the data without the risk of breaking something.
I have a PythonAnywhere account where I have a few dashboards built using Flask.
Each morning, BO sends me an email with the CSV files of the data that I want. I then upload these to a MySQL server and go from there. There is also an option to send them to an FTP recipient... but that's pretty much it.
Is it possible to set up an FTP server on my (paid for) PythonAnywhere account? If I could have those files go to a dir like /data, I could then have a scheduled job to insert them into my DB.
The data is already in the public domain and not sensitive.
Or is there in fact a better way?
PythonAnywhere dev here: we don't support regular FTP, unfortunately. If there was a way to tell BO to send the data via an HTTP POST to a website, then you could set up a simple Flask app to handle that -- but I'm guessing from what you say that it doesn't :-(
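If BO (or some intermediary) could be pointed at an HTTP endpoint, a minimal Flask receiver along these lines would do it; the upload directory and shared-secret header are assumptions for illustration:

```python
import os
from flask import Flask, request, abort

app = Flask(__name__)
UPLOAD_DIR = "/home/yourusername/data"  # assumed target directory

@app.route("/upload", methods=["POST"])
def upload():
    # Basic shared-secret check so only your sender can post files.
    if request.headers.get("X-Upload-Token") != os.environ.get("UPLOAD_TOKEN"):
        abort(403)
    f = request.files.get("file")
    if f is None or not f.filename.endswith(".csv"):
        abort(400)
    f.save(os.path.join(UPLOAD_DIR, os.path.basename(f.filename)))
    return "ok", 200
```

A scheduled task can then pick up anything that lands in that directory and insert it into the MySQL database.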

Migrating from parse.com

Say I have 10K users for my app and I want to switch to my own custom server for the backend. I have seen the Parse export functionality, but I don't see how it helps in this situation.
Even if I export all the data and update the app so that it calls my new custom server, it will still take months for all my users to move to the updated version of the app (many users don't update immediately; I last updated the Facebook app a year ago).
Also, during this transition half of my users would have their data on my custom server and the other half (those who haven't updated) would still be using the Parse server, so queries that require all the data in one place become an issue. I could solve this via replication, but imagine how slow it would be to push the data in real time to both my server and parse.com.
Has anyone thought about this?
What you could do is this: when you release a new version of your app and a user logs in while their data is still on Parse, migrate their data at that point to the new server; from then on that user uses the custom server. That way users move to the new server as they upgrade. I always have a flag that is fetched from my server to force the user to upgrade if needed. Hope that helps.
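One way to picture the per-user migration step, as a sketch only: the Parse REST calls are the real ones, but the class name, query, and custom-backend endpoint are assumptions for illustration.

```python
import requests

PARSE_HEADERS = {
    "X-Parse-Application-Id": "YOUR_APP_ID",
    "X-Parse-REST-API-Key": "YOUR_REST_KEY",
}

def migrate_user(user_id):
    # Fetch this user's objects from Parse via the REST API (class name is an example).
    resp = requests.get(
        "https://api.parse.com/1/classes/GameScore",
        headers=PARSE_HEADERS,
        params={"where": '{"owner": "%s"}' % user_id},
    )
    resp.raise_for_status()
    objects = resp.json()["results"]

    # Write them to the custom backend and flag the user as migrated,
    # so later logins skip this step and talk only to the new server.
    requests.post(
        "https://api.mycustombackend.example.com/import",
        json={"user": user_id, "objects": objects},
    ).raise_for_status()
```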
Copying data over to your new backend periodically until you have finalized your mobile client code, and then letting users update their app on the App Store or Google Play Store, would provide the switch-over. Doing that elegantly depends on the type of app and the user base you have. I wrote up part 1 of a blog post on these considerations for migrating from Parse to the Couchbase Mobile stack and the reasons to consider it.
If you can already put a system in place that writes new data to both places (Parse and the custom backend), then the later copy and merge might be easier to handle, but this is case by case. Then, once the mobile app is updated, you can deprecate the old server, or push data into a local store for users who remain on older versions, since Parse will eventually stop working. Any new features will require updating to the new app version.

Saving third-party images on third-party server

I am writing a service as part of which a user chooses an image from a URL (not on my domain), and later they and others can view that image.
I need to save this image to a third party server (S3).
After a lot of wasted time I found I cannot do it from the client side due to security issues (I can't get the third-party image data and send it from the client side without alerting the client, which is just bad).
I also do not want to do the uploading on my own server, because I run Rails on Heroku and the workers are expensive.
So I thought of two options:
use something like transloadit.com,
or write a service on EC2 that will run over my DB, find the rows where the images are not yet uploaded, and upload them.
I decided to go for EC2 and S3, because the solution I am writing is meant for enterprise and it seems that it will sound better as part of the architecture when presented to customers.
My question is: what setup do I need so I can access the Heroku DB from an external service?
Any better ideas on how to solve this?
So you want to effectively write a worker, but instead of doing it on Heroku you want to do it on EC2? That feels like more work.
As for the database, did you see the documentation? It shows how to get the URL.
PS. Did you not find it in the docs?
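Concretely, the EC2 worker only needs the connection string Heroku exposes (e.g. via `heroku config:get DATABASE_URL`). A rough sketch, under the assumption of a Postgres `images` table with `source_url` and `uploaded` columns and an S3 bucket name that are all placeholders:

```python
import os
import boto3
import psycopg2
import requests

# DSN obtained from `heroku config:get DATABASE_URL`, stored in the worker's environment.
conn = psycopg2.connect(os.environ["HEROKU_DATABASE_URL"], sslmode="require")
s3 = boto3.client("s3")

with conn, conn.cursor() as cur:
    cur.execute("SELECT id, source_url FROM images WHERE uploaded = false")
    for image_id, source_url in cur.fetchall():
        # Download the third-party image and push it straight to S3.
        body = requests.get(source_url, timeout=30).content
        s3.put_object(Bucket="my-image-bucket", Key="images/%s" % image_id, Body=body)
        cur.execute("UPDATE images SET uploaded = true WHERE id = %s", (image_id,))
```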

Heroku architecture for running different applications but on the same domain

I have a unique set-up that I am trying to determine whether Heroku can accommodate. There is so much marketing around polyglot applications, but I can only find one actual example!
My application consists of:
A website written in Django
A separate Java application, which takes files uploaded by users, parses them, and stores the data in a database
A shared database accessible by both applications
Because these user-uploaded files can be enormous, I want the uploaded file to go directly to the Java application. My preferred architecture is:
The Django-generated webpage displays the upload form.
The form does an AJAX submit to the Java application
The browser starts polling the database to see if the Java application has inserted the data
Meanwhile the Java application does its thing w/ the user-uploaded file and updates the database when it's done
The Django webpage AJAX-refreshes a div with the results of the user upload once the polling mechanism sees that the upload is complete
The big issue I can't figure out is whether I can get both the Django and Java apps either running on the same set of dynos, or on different dynos but under the same domain, to avoid AJAX cross-domain issues. Does Heroku support URL-level routing? For example:
Django application available at http://www.myawesomewebsite.com
Java application available at http://www.myawesomewebsite.com/javaurl/
If this is not possible, does anyone have any ideas for work-arounds? I know I could have the user upload the file to Django and have Django send the request to Java from the server-side instead of the client side, but that's an awful lot of passing around of enormous files.
Thanks so much!
Heroku does not support the ability to route via the URL. Polyglot components should exist as their own subdomains and operate in a cross-domain fashion.
As a side note: have you considered uploading directly to S3, instead of uploading to your app on Heroku, which will then (presumably) upload to S3? If you're dealing with cross-domain file uploads, this is worth considering for its high level of scalability.
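For the direct-to-S3 route, the server's only job is to hand the browser a short-lived signed upload policy. A sketch using boto3; the bucket name, key prefix, and size limit are placeholders:

```python
import boto3

s3 = boto3.client("s3")

def signed_upload_params(filename):
    """Return the URL and form fields a browser needs to POST a file straight to S3."""
    return s3.generate_presigned_post(
        Bucket="my-upload-bucket",          # placeholder bucket
        Key="uploads/%s" % filename,
        Conditions=[["content-length-range", 0, 1024 * 1024 * 1024]],  # up to 1 GB
        ExpiresIn=3600,                     # signature valid for one hour
    )
```

The Django app can serve these parameters to the page, the browser posts the enormous file to S3 directly, and the Java app is then told where the file landed.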
