Sharing data with Heroku apps

Hi, what is the recommended way of sharing data between Heroku apps?
The reason I ask is that I have a scheduler app which runs a process every 5 minutes and places data in memcache (MemCachier).
I would then like another, servlet-based app to be able to read that same memcache data and return it to the user.
I tried this, but the reads return null.
Would it be better to use a database, or is there another way of doing this?
Can the memcache be shared across dynos?

Yup, all these things are connected as attachable resources via your config variables. It's perfectly OK to have many apps using the same attached resources.
http://www.12factor.net/backing-services
http://www.12factor.net/config
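
A minimal sketch of the servlet app's side, assuming the spymemcached client and the MEMCACHIER_* config vars the add-on sets; the same code works in the scheduler app, since both apps are just clients of the same attached cache once they share those vars:

```java
import java.io.IOException;
import net.spy.memcached.AddrUtil;
import net.spy.memcached.AuthDescriptor;
import net.spy.memcached.ConnectionFactoryBuilder;
import net.spy.memcached.MemcachedClient;
import net.spy.memcached.auth.PlainCallbackHandler;

public class SharedCache {
    public static MemcachedClient connect() throws IOException {
        // Both apps see the same cache as long as they share these config vars;
        // copy the MEMCACHIER_* values from the scheduler app into the servlet app.
        String servers = System.getenv("MEMCACHIER_SERVERS");
        String username = System.getenv("MEMCACHIER_USERNAME");
        String password = System.getenv("MEMCACHIER_PASSWORD");

        // MemCachier requires SASL auth, which spymemcached supports over the binary protocol.
        AuthDescriptor auth = new AuthDescriptor(
                new String[] { "PLAIN" },
                new PlainCallbackHandler(username, password));

        return new MemcachedClient(
                new ConnectionFactoryBuilder()
                        .setProtocol(ConnectionFactoryBuilder.Protocol.BINARY)
                        .setAuthDescriptor(auth)
                        .build(),
                AddrUtil.getAddresses(servers));
    }
}
```

If a key written by the scheduler still comes back null from the second app, the two apps are almost certainly not pointing at the same MEMCACHIER_SERVERS value.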

Related

WildFly 10 HA deploy: not losing sessions

I have been reading posts and documentation all day long about this topic, and still can't find something easy to understand and trust.
I currently have my webapp deployed on WildFly 10, as a simple war file.
It's an e-commerce website, in production for a few weeks, and every time we need to deploy a new release, well... that's very annoying, because some customers could be shopping right now, and deployment will obviously make them lose their sessions, and that's very bad.
I need a solution to deploy a new war without restarting the application server. At first, I read the docs about clustering (domain configuration over standalone configuration), but I'm not sure that's enough for me...
Imagine the same customer with a few items in the shopping cart (http session), accessing the first node of the cluster.
Then I put it down, because I'm deploying.
OK, the customer will be redirected to the second node of the cluster but... will the session data still be available? Will he 'lose' the shopping cart items?
I read about sticky sessions, but nothing about configuring them in WildFly. I am on Amazon AWS, so I can use ELB (load balancer), too.
Can you help me understand exactly what I need to learn and use?
Each WildFly instance will have its own session id that it keeps in a cookie. This id will only restore the session on the particular node it came from.
Sticky sessions mean that the ELB will always route the user to the same node in your cluster, so that won't quite solve your problem.
Some things to think about:
Clustering
Clustering may help (it doesn't need to be domain mode). With HA enabled (running an HA profile and marking the war as <distributable/> in web.xml), the session is replicated between the nodes automatically, so the cookie in the client's browser can restore the session on either node. This of course has the issue that if you upgrade one of the war files first, you may have a session object that can no longer be de-serialized because its class changed.
Clustering WildFly on AWS is a little tricky as well, because nodes can't use UDP multicast to discover each other. We use a database connection to keep track of nodes and do clustering.
Roll your own
One option is to roll your own solution and keep only the minimum amount of information on the client. Something like:
Create a record in the database with a GUID.
Set the GUID in a cookie.
Save the items in the cart in the database, keyed by the GUID.
Have a filter that checks for the GUID cookie and restores the cart each time the user hits the site.
I've used an approach like this for e-commerce apps in the past. It has another side effect that you now have the person's shopping cart saved in your database and it's easy to see exactly what people were interested in buying.
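
A rough sketch of that filter, assuming a plain servlet environment (javax.servlet); the cookie name and the persistence step are placeholders for whatever your app actually uses:

```java
import java.io.IOException;
import java.util.UUID;
import javax.servlet.Filter;
import javax.servlet.FilterChain;
import javax.servlet.FilterConfig;
import javax.servlet.ServletException;
import javax.servlet.ServletRequest;
import javax.servlet.ServletResponse;
import javax.servlet.http.Cookie;
import javax.servlet.http.HttpServletRequest;
import javax.servlet.http.HttpServletResponse;

public class CartCookieFilter implements Filter {
    private static final String COOKIE_NAME = "CART_ID"; // placeholder name

    public void init(FilterConfig config) {}
    public void destroy() {}

    public void doFilter(ServletRequest req, ServletResponse res, FilterChain chain)
            throws IOException, ServletException {
        HttpServletRequest request = (HttpServletRequest) req;
        HttpServletResponse response = (HttpServletResponse) res;

        String cartId = findCartId(request);
        if (cartId == null) {
            // First visit: mint a GUID and hand it to the client.
            cartId = UUID.randomUUID().toString();
            // ...create the cart row keyed by this GUID here (JDBC/JPA/etc.)...
            Cookie cookie = new Cookie(COOKIE_NAME, cartId);
            cookie.setMaxAge(60 * 60 * 24 * 30); // keep the cart for 30 days
            cookie.setPath("/");
            response.addCookie(cookie);
        }
        // Expose the id so downstream code can restore the cart from the database.
        request.setAttribute("cartId", cartId);
        chain.doFilter(req, res);
    }

    private String findCartId(HttpServletRequest request) {
        if (request.getCookies() == null) return null;
        for (Cookie c : request.getCookies()) {
            if (COOKIE_NAME.equals(c.getName())) return c.getValue();
        }
        return null;
    }
}
```

Because the cart lives in the database rather than the HTTP session, a redeploy (or a node going down) no longer loses it.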
Use Tomcat parallel deployment
Does your application require a full app server? If it is just servlet-based, you could try using Tomcat and its parallel deployment functionality. You deploy the new war alongside the old one using a version suffix in the file name (e.g. shop##002.war next to shop##001.war); Tomcat keeps serving existing sessions from the old war while new sessions go to the new one.
Parallel deployment is very cool if your app is simple enough to run on Tomcat.

Saving third-party images on third-party server

I am writing a service as part of which a user chooses an image by URL (not on my domain), and later he and others can view that image.
I need to save this image to a third party server (S3).
After a lot of wasted time I found I cannot do it from the client side due to browser security restrictions (I can't fetch the third-party image data and send it from the client side without alerting the client, which is just bad).
I also do not want to do the uploading on my server, because I run Rails on Heroku and the workers are expensive.
So I thought of two options:
use something like transloadit.com,
or write a service on EC2 that will run over my db, find the rows where the images have not yet been uploaded, and upload them.
I decided to go for EC2 and S3, because the solution I am writing is meant for enterprise, and it seems it will sound better as part of the architecture when presented to customers.
My question is: what setup do I need so I can access the Heroku db from an external service?
Any better ideas on how to solve this?
So you want to effectively write a worker, but instead of running it on Heroku you want to run it on EC2? That feels like more work.
As for the database, did you see the Heroku documentation? It shows how to get the connection URL from your config vars.
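
For the EC2 side, a minimal sketch of using that URL from Java, assuming the external service is written in Java with the standard PostgreSQL JDBC driver on the classpath (Heroku exposes the URL in the DATABASE_URL config var, and external clients must connect over SSL):

```java
import java.net.URI;
import java.sql.Connection;
import java.sql.DriverManager;

public class HerokuDb {
    // DATABASE_URL looks like postgres://user:pass@host:5432/dbname;
    // JDBC wants it split into a jdbc:postgresql:// URL plus separate credentials.
    public static Connection connect() throws Exception {
        URI uri = new URI(System.getenv("DATABASE_URL"));
        String[] userInfo = uri.getUserInfo().split(":");
        String jdbcUrl = "jdbc:postgresql://" + uri.getHost() + ":" + uri.getPort()
                + uri.getPath() + "?sslmode=require"; // SSL is mandatory off-platform
        return DriverManager.getConnection(jdbcUrl, userInfo[0], userInfo[1]);
    }
}
```

The same parsing works in Ruby or any other language; the only Heroku-specific part is reading DATABASE_URL and requiring SSL.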

How do I share sessions between 2 Heroku apps?

In the context of "What is the difference between a Cookie and Redis Session store?", how can I share session data among Heroku apps (a Sinatra frontend with a Rack API backend)?
These two questions have suggestions on how to accomplish this:
Subdomain Session Not Working in Rails 2.3 and Rails 3 on Heroku with/without a Custom Domain?
Rails Checkout SSL heroku
I think what you would want to do is use an external session store (such as Redis) and simply copy the Redis config vars from the first app into the second. You should then be able to access the same session data from both Heroku apps.
EDIT 1:
It also occurs to me that, as these are two separate Heroku apps, you will (probably) have two separate domains/subdomains. You will need to ensure that your session cookie is readable from both domains (for example, by scoping the cookie to a shared parent domain, if both apps run under custom subdomains of the same domain).

Heroku architecture for running different applications but on the same domain

I have a unique set-up I am trying to determine whether Heroku can accommodate. There is so much marketing around polyglot applications, but only one example I can actually find!
My application consists of:
A website written in Django
A separate Java application, which takes files uploaded by users, parses them, and stores the data in a database
A shared database accessible by both applications
Because these user-uploaded files can be enormous, I want the uploaded file to go directly to the Java application. My preferred architecture is:
The Django-generated webpage displays the upload form.
The form does an AJAX submit to the Java application
The browser starts polling the database to see if the Java application has inserted the data
Meanwhile the Java application does its thing w/ the user-uploaded file and updates the database when it's done
The Django webpage AJAX-refreshes a div with the results of the user upload once the polling mechanism sees that the upload is complete
The big issue I can't figure out is whether I can get both the Django and the Java apps either running on the same set of dynos, or on different dynos but under the same domain, to avoid AJAX cross-domain issues. Does Heroku support URL-level routing? For example:
Django application available at http://www.myawesomewebsite.com
Java application available at http://www.myawesomewebsite.com/javaurl/
If this is not possible, does anyone have any ideas for work-arounds? I know I could have the user upload the file to Django and have Django send the request to Java from the server-side instead of the client side, but that's an awful lot of passing around of enormous files.
Thanks so much!
Heroku does not support routing by URL path. Polyglot components should exist as their own subdomains and operate in a cross-domain fashion.
As a side note: have you considered uploading directly to S3, instead of uploading to your app on Heroku, which will then (presumably) upload to S3? If you're dealing with cross-domain file uploads, this is worth considering for its scalability.
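
If you do go the subdomain route, the cross-domain AJAX issue is usually handled with CORS headers on the Java side. A minimal sketch as a servlet filter; the origin below is this question's hypothetical domain:

```java
import java.io.IOException;
import javax.servlet.Filter;
import javax.servlet.FilterChain;
import javax.servlet.FilterConfig;
import javax.servlet.ServletException;
import javax.servlet.ServletRequest;
import javax.servlet.ServletResponse;
import javax.servlet.http.HttpServletRequest;
import javax.servlet.http.HttpServletResponse;

public class CorsFilter implements Filter {
    public void init(FilterConfig config) {}
    public void destroy() {}

    public void doFilter(ServletRequest req, ServletResponse res, FilterChain chain)
            throws IOException, ServletException {
        HttpServletRequest request = (HttpServletRequest) req;
        HttpServletResponse response = (HttpServletResponse) res;

        // Allow the Django site's origin to call this app from the browser.
        response.setHeader("Access-Control-Allow-Origin", "http://www.myawesomewebsite.com");
        response.setHeader("Access-Control-Allow-Methods", "GET, POST, OPTIONS");
        response.setHeader("Access-Control-Allow-Headers", "Content-Type");

        if ("OPTIONS".equalsIgnoreCase(request.getMethod())) {
            response.setStatus(HttpServletResponse.SC_OK); // answer the preflight and stop
            return;
        }
        chain.doFilter(req, res);
    }
}
```

With that in place, the form on www.myawesomewebsite.com can POST the file straight to java.myawesomewebsite.com (or to S3, per the note above) without any same-origin workaround.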

Can I get all the performance metrics of an asp.net MVC 3 app from within the app itself?

So say I've got an MVC app hosted in the cloud somewhere, meaning I don't have access to IIS or any infrastructure.
All I have control over is the App code itself, and what comes down to the client.
My goal
Is to collect data over time of how well the MVC app is performing in terms of response times.
Current Problems
I can get a lot of data from Google Analytics and other client-side tricks, but that won't tell me if, say, the App Pool is recycling too often.
Similarly, if I put stopwatches in the actions, that won't tell me about any delays in the app startup (if it has to start up again).
Also, if I do put a stopwatch in the action, it doesn't take into account any delays in rendering the view. For example, even though it's bad practice, there might be a DB call being made from the view, and my action metrics won't take that into account.
My Question
So, if I want to get true metrics of how long requests are taking overall, from multiple clients and users, where are the best places to put stopwatches in the app? Or is it impossible to get true metrics from the app itself, and do I have to place counters outside of the app (like in IIS)?
Add New Relic; it's available for free as part of the AppHarbor service - https://appharbor.com/addons/newrelic
Since you mention "in the cloud somewhere": are you using Microsoft Azure for hosting? If so, there are some great diagnostics you can log to your Azure storage with DiagnosticMonitorConfiguration.
Here's a tutorial on how to add diagnostics to your web and worker roles. You can find a full list of performance counters on MSDN
You can get everything from application requests/second, memory and CPU utilization, network adapter statistics, output cache hits/misses, request execution time, etc.
