transfer normal website code to parse cloud - parse-platform

I have a fully working website developed with PHP & JavaScript. If I want to transfer it to the Parse cloud, do I need to change the whole (backend/frontend) code for it? Please tell me the steps to follow. And how do I transfer a MySQL database to Parse data?

These are general steps; depending on the current website it may require more corrections/steps/polishing:
- Create a database in Parse equivalent to your MySQL schema, in terms of classes/collections.
- Create Cloud Code methods equivalent to your server-side PHP methods.
- Change the client side so that calls to PHP services/methods become calls to Parse Cloud Code methods (see the sketch after this list).
- Put your client-side files in the public/ directory of your Parse Cloud Code project.
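As a rough sketch of the second and third steps, a server-side PHP method might become a Cloud Code function like the one below. This is only an illustration: the "Product" class, the "category" field, and the "getProducts" name are hypothetical stand-ins for your own schema and PHP methods.

    // cloud/main.js -- a minimal sketch using the Parse.com-era Cloud Code API;
    // "Product", "category" and "getProducts" are hypothetical names.
    Parse.Cloud.define("getProducts", function(request, response) {
      var query = new Parse.Query("Product");             // class replacing a MySQL table
      query.equalTo("category", request.params.category); // filter like a WHERE clause
      query.find({
        success: function(results) { response.success(results); },
        error: function(error) { response.error(error); }
      });
    });

On the client, the old AJAX call to a PHP script is then replaced with a Parse.Cloud.run call:

    // Replaces something like $.ajax("getProducts.php", ...) in the old front end
    Parse.Cloud.run("getProducts", { category: "books" }, {
      success: function(results) { console.log(results.length + " products"); },
      error: function(error) { console.error(error); }
    });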

Related

Parse Server Migration, Swapping Client Code

As a server rookie and Parse user, I need to migrate, and I intend to migrate to Parse Server, likely with Heroku and mLab.
Once I have clicked Migrate and Finalise in the Parse Dashboard, all data from my original Parse client code goes to the new database, right?
Once migrated, can I just push an update of my client code with the new Parse Server SDK pointing to the new server?
My main overriding question is: do I need to do any management on the client side, sending data to both servers? Or does the Parse migration handle this?
I think you are mixing two different things. Read the tutorial.
Simply:
Step 1
Move your data from Parse.com to a self-hosted database (mLab, MongoDB, and more...). After this step api.parse.com will use the "external" database, but you will still use the code and servers from Parse.com (when you send a query to your app, it goes to api.parse.com, which then accesses the database). Do this by the end of April 2016.
Step 2
Move from api.parse.com to your own instance of Parse Server (the one you download from GitHub or install on Heroku). You will need to change the code in your app, because it won't use api.parse.com from this point on (see the sketch below). Do this by the end of July 2016.
On GitHub the developers still say that it is not "production ready". You should only migrate your database now and build the whole server later. You can read the discussion here.
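For step 2, the client-side change is usually just re-initializing the SDK against your own server instead of api.parse.com. With the JavaScript SDK it looks roughly like this (the app ID, key, and Heroku URL are placeholders):

    // Sketch of re-pointing the JavaScript SDK; keys and URL are placeholders.
    var Parse = require('parse/node');

    Parse.initialize("YOUR_APP_ID", "YOUR_JAVASCRIPT_KEY");

    // Before: the SDK defaulted to https://api.parse.com/1
    // After: point it at your own Parse Server instance
    Parse.serverURL = "https://your-app.herokuapp.com/parse";

The mobile SDKs have an equivalent server URL setting. You should not need to send data to both servers from the client: after the step 1 migration, api.parse.com and your own Parse Server read the same database.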

How to move cloud code from parse.com to heroku

I have moved my Parse server from parse.com to Heroku. Everything is working fine except the cloud code (the 'cloud/main.js' file).
I replaced the "main.js" from parse.com with the "main.js" of the Parse Server code and deployed it on Heroku, but it is not working. I get the following error when I make a request from my mobile app:
{"code":1,"message":"Internal server error."} (Code: 1, Version: x.xx.x)
Any idea?
Note:
I've followed this link for migrating to Parse Server:
https://learnappmaking.com/how-to-migrate-parse-app-parse-server-heroku-mongolab/
Migrating cloud code can range in difficulty depending on how involved that code is. Here's a workflow for validating your code:
1) Check that you can build your Heroku app locally with the right Node version.
2) Comment out all of your cloud code. You want to reintroduce your code in parts and make sure it compiles with each reintroduced function.
3) Install the node modules for each service that you use. If you use Stripe/Mailgun or any other package, add them to your package.json file and run npm install. Then include them in your main.js file with require('packageName').
4) Parse Server uses Express.js version 4.x, while Parse.com runs Express version 2.x or 3.x, but not 4.x. If you use any middleware, you need to change it to the proper Express 4.x syntax/methodology.
5) There is no support for cloud jobs, so rename all your *.job functions to *.define and comment them properly so you can come back to them later (see the sketch after this list). If you did not use cloud jobs, don't worry.
6) If you did use cloud jobs, you now need to set up a Heroku worker/scheduler to run those old *.job (now *.define) calls at the time intervals you had.
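As an illustration of steps 3 and 5, a former cloud job might be reworked like this. The stripe require and the "sendReminders" job are hypothetical examples, not taken from the question:

    // cloud/main.js -- hypothetical illustration of steps 3 and 5.
    // Step 3: require modules you added to package.json (e.g. npm install stripe)
    var stripe = require('stripe')('sk_test_placeholder');  // placeholder key

    // Step 5: a former Parse.Cloud.job("sendReminders", ...) renamed to a
    // define so the code compiles; schedule it later via a Heroku
    // worker/scheduler (step 6).
    Parse.Cloud.define("sendReminders", function(request, response) {
      var query = new Parse.Query("Reminder");   // hypothetical class
      query.equalTo("sent", false);
      query.find().then(function(reminders) {
        // ...send each reminder here...
        response.success("Processed " + reminders.length + " reminders");
      }, function(error) {
        response.error(error);
      });
    });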

How to query heroku database from salesforce (eg. via vf page)

Can anyone help me on this? How can I achieve the following:
I need to set up a prototype integration with Heroku from a Salesforce application. The need is to store a huge volume of data in a single Heroku table and make it accessible from Salesforce, e.g. have a VF page in Salesforce which queries the data in real time.
We're doing this. We use Heroku Connect: we created a Salesforce custom object and bidirectionally synced it using Heroku Connect. You can then update the data on either side (in Salesforce or in Heroku Postgres).
But Heroku Connect is expensive, and priced on the number of syncs per month.
https://www.heroku.com/connect
Firstly, Heroku is not a database; it is a PaaS.
My approach would be:
- Create a Heroku app (this is a link to create a Node.js app) link
- After you create the app, choose one of the datastore add-ons -> link. If you want a SQL/column-oriented database, choose Heroku Postgres or ClearDB MySQL.
- Secondly, you actually need a web server that exposes this database to you. I am thinking you would make this a web service of some sort. You can build that in Node.js; here is something that will get you started or give you an idea - link. A minimal sketch follows this list.
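A minimal sketch of such a web service in Node.js, assuming Heroku Postgres and the usual Express + pg packages; the "records" table and the query are hypothetical:

    // server.js -- a minimal sketch; the "records" table is hypothetical.
    var express = require('express');
    var pg = require('pg');

    var app = express();
    // Heroku Postgres exposes its connection string as DATABASE_URL
    var pool = new pg.Pool({ connectionString: process.env.DATABASE_URL });

    // An endpoint a Salesforce VF page could call in real time
    app.get('/records', function(req, res) {
      pool.query('SELECT * FROM records LIMIT 100', function(err, result) {
        if (err) return res.status(500).json({ error: err.message });
        res.json(result.rows);
      });
    });

    app.listen(process.env.PORT || 3000);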

Lotus Domino: Migrating code changes to production, in clustered environment

We have a clustered environment for our Domino servers in production. I want to migrate code changes from staging to production. I have not changed the signature of any of the old functions in the script library, but I have added a new function to the script library which is called by a specific agent. All works well in staging. Now I want to transfer these changes to the cluster (consisting of two servers) in production.
If I copy and paste the new function (in the script library) and also the changed agent which calls this new function to one of the servers in production, will these code changes automatically be replicated to the other server? I mean, what's the best way to migrate these changes?
Thanks in advance.
Data and design elements are replicated immediately between clustered servers. So, if you change an agent or script library on the first server, the second server gets the changes only seconds later.
Sometimes you get the error message "Error loading USE or USELSX module" after changing a script library. The error occurs if you call an agent or open a form which uses the script library. In this case, you have to recompile the agent or form so the design elements work properly with the new internal structure of the script library.
This error probably won't appear in your case, as your changes work well in the development environment. You should nevertheless test all parts of your application which use the changed script library, to make sure everything will work fine.
If you really want to make it seamless:
1) make your staging database a master template, and
2) make your production database inherit the design from that master template.
Then, on one of your production databases, choose Application > Refresh Design; it will ask which server to refresh the design from. Make this your staging server.
It's particularly important to recompile all LotusScript if you don't do this; otherwise, you may end up with "Type mismatch on external name: ". If you do this on your staging server, both the uncompiled and the compiled LotusScript design documents will be part of the design refresh, and it'll make things a lot easier.
Note that all clients must completely close and reopen the database to recognize any code changes. (This means the database tab itself, as well as any documents that are open from that database.)

Local file blob save

I am a bit stuck with Windows Azure Blob storage.
I have a controller that receives a (local) file path.
So on the web page I do something like this:
http:...?filepath=C:/temp/myfile.txt
On the web service I want to get this file and put it in the blob service. When I run it locally there is no problem, but when I publish it there is no way to get the file. I always get:
Error encountered: Could not find a part of the path 'C:/temp/myfile.txt'.
Can someone help me? Is there a solution?
First, I would say that to get proper help you need to provide a better description of your problem. What do you mean by "on the web service"? A WCF web role seems to match your partial problem description. However, most web services use http://whatever.cloudapp.net/whatever.svc, as well as http://whatever.cloudapp.net/whatever.aspx?whatever if added. Have you done something like that in your application?
You have also mentioned a controller in your code, which makes me think it is an MVC-based Web Role application.
I am writing the above to help you formulate your question better next time.
Finally, based on what you have provided: you are reading a file from the local file system (C:\temp\myfile.txt) and uploading it to an Azure Blob. This will work in the compute emulator but will surely fail in Windows Azure, because:
In your Web Role code you will not have access to write to the C:\ drive, which is why the file is not there and you get the error. Your best bet is to use Azure Local Storage to write any content, then read the file back from Local Storage and upload it to the Azure Blob. Azure Local Storage is designed for writing content from a web role (you will have write permission).
Finally, I am concerned about your application design as well, because Azure VMs are not persistent; a solution that writes anywhere on the VM is not good, and you may need to write directly to Azure storage without staging the file on the VM, if that is at all possible.
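The original code is presumably .NET, but as a sketch of that last suggestion, here is the direct-to-blob idea in Node.js with the @azure/storage-blob package; the container name and connection-string setting are placeholders. The key point is to upload the file content the server actually received, since a path such as C:/temp/myfile.txt from the query string will generally not exist on the server:

    // upload.js -- a minimal sketch of writing directly to Blob storage.
    const { BlobServiceClient } = require('@azure/storage-blob');

    async function uploadToBlob(buffer, blobName) {
      const service = BlobServiceClient.fromConnectionString(
        process.env.AZURE_STORAGE_CONNECTION_STRING   // placeholder setting
      );
      const container = service.getContainerClient('uploads');  // placeholder name
      await container.createIfNotExists();
      const blob = container.getBlockBlobClient(blobName);
      await blob.uploadData(buffer);   // bytes received in the request, not a path
      return blob.url;
    }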
Did you verify the file exists on the Azure server?
