I am getting started with Heroku and am concerned about whether it will charge me for a mistake I make, for example filling the database beyond 5 MB.
So, is there a way to set a billing limit on Heroku, or a notification system that alerts me when I go over a price limit?
Thank you in advance!
I don't think so. But Heroku only bills you if you actually signed up for something non-free, like an add-on. If you use up 5 MB of database, they will send you an email telling you that you are over the limit, and will simply stop all INSERT operations to the database.
Another thing to note is not to leave the Heroku console running longer than necessary, since it spins up another dyno and is billable.
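If you want to double-check that nothing billable was left running, the Heroku CLI (assuming it's installed and you're logged in; `your-app-name` is a placeholder) lists every dyno, including one-off console dynos:

```
# List all running dynos, including one-off console dynos:
heroku ps --app your-app-name

# Stop a one-off dyno that was left behind (name taken from the list above):
heroku ps:stop run.1 --app your-app-name
```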
All limits are soft limits. The new row limits imposed on the database plans will, if exceeded, trigger an email notification so you can take action.
I am using Google SMTP to send a lot of mail, but if I send too many messages it returns a quota error.
So I want to be able to check my quota at any time, but I couldn't find anything about this.
Is there any way, such as an API, to do this?
There is no way to check your quota via an API. You can check the Google Developer Console for your project manually, but that is an estimate at the very best.
My advice is to keep a running total of the requests you are sending and to space them out so that you don't hit any of the quotas.
Once you hit a quota error, make sure you stop and take a break for a little while, possibly implementing exponential backoff, though this depends on which quota you are hitting.
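A minimal sketch of that backoff idea in PHP; `sendWithBackoff()` and the `QuotaException` class are hypothetical stand-ins for whatever your mail library actually throws on a quota error:

```php
<?php

// Stand-in exception; in real code, catch whatever your mail library
// throws when Google rejects a send for quota reasons.
class QuotaException extends RuntimeException {}

/**
 * Call $send, retrying with exponential backoff on quota errors.
 */
function sendWithBackoff(callable $send, int $maxRetries = 5): void
{
    for ($attempt = 0; $attempt <= $maxRetries; $attempt++) {
        try {
            $send();
            return; // sent successfully
        } catch (QuotaException $e) {
            if ($attempt === $maxRetries) {
                throw $e; // out of retries; give up
            }
            // Wait 1s, 2s, 4s, 8s, ... plus up to 1s of random jitter.
            $delaySeconds = (2 ** $attempt) + random_int(0, 1000) / 1000;
            usleep((int) ($delaySeconds * 1000000));
        }
    }
}
```

Jitter matters here: if several senders back off in lockstep, they all retry at the same moment and hit the quota again together.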
I have a Heroku server with about $250.00 worth of monthly add-ons (due to upgraded Heroku Postgres and Heroku Redis plans). I'm no longer using the server for the foreseeable future, but I would like to be able to boot it back up at a later date with the same configuration.
Is there a way to temporarily halt all server functionality to prevent myself from getting billed, with the possibility of rebooting the server at a later date?
Well, you can step the dynos down to the hobby-dev tier if you have fewer than two process types, or you can simply shut them down. Just go to https://dashboard.heroku.com/, click on your app, and then go to the 'Resources' tab to control the dynos.
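The same thing can be done from the Heroku CLI; `your-app-name` and the `worker` process type are placeholders here:

```
# Scale every process type to zero so no dyno-hours accrue:
heroku ps:scale web=0 worker=0 --app your-app-name

# Or step the dynos down to the hobby tier instead of stopping them:
heroku ps:type hobby --app your-app-name
```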
Stepping down Heroku Redis should be easy too. It's temporary storage anyway, which you can restart or scale up again later.
The only sticking point might be your Postgres DB. If it has more than 10,000 rows, you'll have to pay at least $9 per month, and if you have more than 1 million rows, you'll have to pay at least $50 per month. DBs often accumulate a lot of log data, so consider cleaning and compacting the data if that's possible. Alternatively, you can take a local database dump, decommission the DB, and upload the dump again when you decide to restart the app (this is a bit of an extreme step, though, so be doubly sure that you have everything backed up).
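If you do go the dump-and-decommission route, a sketch of the steps with the Heroku CLI and pg_restore (the app name and local connection details are placeholders):

```
# Capture and download a backup before decommissioning the database:
heroku pg:backups:capture --app your-app-name
heroku pg:backups:download --app your-app-name    # writes latest.dump

# Later, restore the dump into a fresh database:
pg_restore --verbose --clean --no-acl --no-owner \
  -h localhost -U myuser -d mydb latest.dump
```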
I'm attempting to consume the PayPal API transactions endpoint.
I want to grab ALL transactions for a given account. This number could potentially be in the tens of millions. Each of these transactions needs to be stored in the database for processing by a queued job. I've been trying to figure out the best way to pull this many records with Laravel. PayPal has a maximum of 20 items per page.
I initially started with the idea of dispatching a job, when a user gives me their API credentials, that gets the first 20 items, processes them, and then dispatches another job containing the next starting index to use. This would loop until it errored out. It doesn't seem to be working well, though: it causes a gateway timeout when saving those API credentials, and the request to the API eventually times out (before getting all transactions). I should also mention that the total number of transactions is unknown, so chaining doesn't seem to be the answer, as there is no way to know how many jobs to dispatch...
Thoughts? Is getting API data best suited for a job?
Yes, a job is the way to go. I'm not familiar with the PayPal API, but it seems requests are rate limited (see PayPal's rate-limiting documentation), so you might want to delay your API requests a bit. You could also make a class that monitors your API consumption by tracking the latest requests you made; in the job, you can use it to determine when to fire the next request, and record each one in the database.
My humble advice: please don't pull all the data. Your database will get bloated quickly, and you'll need to scale every time you add a new account, which is not an easy task.
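For that monitoring class, Laravel 8+ already ships a RateLimiter you could lean on. A sketch of how it might look inside a queued job's handle() method; the key, the 50-per-minute budget, and the `$client` wrapper are assumptions, not PayPal's documented limits:

```php
use Illuminate\Support\Facades\RateLimiter;

// Inside the job's handle() method:
if (RateLimiter::tooManyAttempts('paypal-api', $maxPerMinute = 50)) {
    $this->release(30); // over our self-imposed budget; retry in 30 seconds
    return;
}

RateLimiter::hit('paypal-api', 60); // count this call; counter decays after 60s
$page = $client->transactions($this->startIndex, 20); // hypothetical API wrapper
```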
You could dispatch the same job again at the end of the first job, querying your current database to find the starting index for the next batch of transactions.
That way, even if a job errors out, you can dispatch it again and it will resume from where it previously ended.
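A minimal sketch of that self-dispatching pattern in Laravel. The `PayPalClient` service, the `transactions()` method on it, and the `Transaction` model are assumptions; only the job mechanics are standard Laravel:

```php
<?php

namespace App\Jobs;

use App\Models\Transaction;          // assumed model with a unique transaction_id
use App\Services\PayPalClient;       // hypothetical wrapper around the PayPal API
use Illuminate\Bus\Queueable;
use Illuminate\Contracts\Queue\ShouldQueue;
use Illuminate\Foundation\Bus\Dispatchable;
use Illuminate\Queue\InteractsWithQueue;
use Illuminate\Queue\SerializesModels;

class FetchPaypalTransactions implements ShouldQueue
{
    use Dispatchable, InteractsWithQueue, Queueable, SerializesModels;

    private const PAGE_SIZE = 20; // PayPal's max items per page

    public function __construct(private int $startIndex = 0) {}

    public function handle(PayPalClient $client): void
    {
        // Fetch exactly one page, so each job stays short.
        $page = $client->transactions($this->startIndex, self::PAGE_SIZE);

        foreach ($page as $txn) {
            // Idempotent upsert: a retried job won't duplicate rows.
            Transaction::updateOrCreate(
                ['transaction_id' => $txn['id']],
                ['payload' => json_encode($txn)]
            );
        }

        // A full page suggests there is more; re-dispatch with the next
        // offset, slightly delayed to respect rate limits. A short page
        // means we've reached the end, so the chain stops naturally —
        // no need to know the total count up front.
        if (count($page) === self::PAGE_SIZE) {
            self::dispatch($this->startIndex + self::PAGE_SIZE)
                ->delay(now()->addSeconds(2));
        }
    }
}
```

The web request that receives the API credentials then only does `FetchPaypalTransactions::dispatch();` and returns immediately, which avoids the gateway timeout described in the question.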
Maybe you will need to link your app with another data engine like AWS. In any case, I think the best idea is to create an API, pull only the most important data, indexed, and keep all the big data at another endpoint where you can reach it if you need to.
Can anyone please explain why Magento sends order emails via a cron job?
I set up the cron job to send emails every 5 minutes.
Is there any issue if I switch to sending the customer an instant email confirmation?
My customer asks why he can't receive the order confirmation instantly.
Can anyone please explain why Magento sends order emails via a cron job?
Well, their changelogs don't really explain why, but generally the reasons for moving a process to a cron job are:
It goes from synchronous to asynchronous
The processing time doesn't matter as much
The web server doesn't need to handle it (timeouts may not be relevant, memory limits may be larger, interference with the web server pool may be lessened)
I set up the cron job to send emails every 5 minutes. Is there any issue if I switch to sending the customer an instant email confirmation?
Not really, no, other than that it would be a regression in Magento functionality. Take the checkout process as an example: when you place your order, a variety of things happen (save the quote, convert the quote to an order, prepare the payment, capture the payment, create the invoice, save everything, and so on). In this case, they've taken the time it takes to generate and send the order email out of that process to improve checkout speed.
Yes, you can put it back to being sent instantly if you'd like, but my suggestion would be to run your cron every minute instead of every five minutes.
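For reference, a one-minute schedule would look something like this in the crontab (assuming Magento 2; the paths are placeholders):

```
# Run the Magento cron every minute instead of every five:
* * * * * php /path/to/magento/bin/magento cron:run >> /var/log/magento-cron.log 2>&1
```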
Generally, you should follow the rule of "try not to touch core Magento functionality unless you have to". Hope this helps!
We have a self-hosted content management system which offers clients the ability to sync data from their GA profile with the content they maintain within the application. So, we're dealing with an arbitrary number of clients who can have an arbitrary number of profiles. Each client is required to simply enter their profile ID and request a token.
The issue we are running into is that our Google Developer project is hitting its 50k/day quota pretty quickly, and Google has been unresponsive to our requests to increase it. Does anyone have experience with requesting an increase? If so, about how long did it take?
Aside from a quota increase, does anyone have suggestions on how we can avoid this situation?
It can take several weeks to get a quota increase set up. That is why it is recommended that you keep a check on your quota and request an increase when you hit 80% of the maximum. The only way to avoid this is to tune your requests: don't request data you already have; store it in your own system. If it's an existing customer, you should only be requesting the last few days' worth of data. Everything else is static and shouldn't change.
How long ago did you request it and how critical is your problem?
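A sketch of the incremental approach described above, in PHP. Everything here is an assumption about your schema: `fetchGaReport()` stands in for your actual GA API client call, and `ga_daily` is a hypothetical table keyed on profile and date:

```php
<?php

// Incremental sync: only request the window since the last stored day
// instead of re-pulling a profile's full history on every run.
function syncProfile(PDO $db, string $profileId): void
{
    // Find the most recent day we already have for this profile.
    $stmt = $db->prepare('SELECT MAX(report_date) FROM ga_daily WHERE profile_id = ?');
    $stmt->execute([$profileId]);
    $lastDate = $stmt->fetchColumn() ?: date('Y-m-d', strtotime('-30 days'));

    // Re-request a couple of overlap days (recent GA data can still shift),
    // then everything newer, and nothing older.
    $start = date('Y-m-d', strtotime($lastDate . ' -2 days'));
    $end   = date('Y-m-d');

    foreach (fetchGaReport($profileId, $start, $end) as $row) {
        // Upsert (MySQL REPLACE here) so overlap days are refreshed, not duplicated.
        $db->prepare('REPLACE INTO ga_daily (profile_id, report_date, sessions) VALUES (?, ?, ?)')
           ->execute([$profileId, $row['date'], $row['sessions']]);
    }
}
```

With this, each existing client costs a handful of requests per day instead of a full history pull, which is what eats the 50k/day project quota.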