A Heroku log would look something like:
2021-07-28T13:26:07.189019+00:00 heroku[web.1]: hello world
Is it possible to change the time format so that my human brain doesn't need to do math and can understand it directly? Something like 1:26:07 PM 2021/07/28 would be nice.
While I'm on this topic, this may be an obvious question, but is it possible to keep Heroku from deleting my logs when I close the website?
This is not configurable:
Timestamp - The date and time recorded at the time the log line was produced by the dyno or component. The timestamp is in the format specified by RFC5424, and includes microsecond precision.
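The format itself can't be changed on Heroku's side, but you can reformat lines as you read them. Here's a minimal sketch in Python, assuming each line starts with the ISO-8601 timestamp followed by a space:

```python
from datetime import datetime

def humanize(line: str) -> str:
    """Rewrite the leading RFC5424-style timestamp of a Heroku log line
    into a friendlier format (assumes the line starts with an ISO-8601
    timestamp followed by a space)."""
    stamp, _, rest = line.partition(" ")
    dt = datetime.fromisoformat(stamp)  # handles the +00:00 offset
    # %-I (hour without leading zero) is a glibc/macOS extension; use %I on Windows
    return f'{dt.strftime("%-I:%M:%S %p %Y/%m/%d")} {rest}'

print(humanize("2021-07-28T13:26:07.189019+00:00 heroku[web.1]: hello world"))
# → 1:26:07 PM 2021/07/28 heroku[web.1]: hello world
```

You could wrap this in a small script and pipe `heroku logs` through it.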
However, Heroku logs aren't really meant to be your primary log store. As you have noticed, they don't store much data or retain it for long:
Logplex is designed for collating and routing log messages, not for storage. It retains the most recent 1,500 lines of your consolidated logs, which expire after 1 week.
For more production-ready persistence of logs, you can add one of the Heroku platform's available logging add-ons to your app. Most of these add-ons offer a free plan to get started.
Alternatively, you can implement your own log drains for full control over what happens to your logs.
Many of the third-party log storage options will let you configure how timestamps are displayed. For example, here's some documentation from Papertrail about how to use a more convenient time zone.
I suggest you start by reviewing the available logging add-ons, then pick one or two and try them out.
Related
I see how to view heroku logs live
heroku logs -t
and how to view the last n logs e.g.
heroku logs -n 500
would show the last 500 lines of logs. But n can only be so large, because Heroku simply doesn't store logs forever. For example, I tried heroku logs -n 5000 and got about 1,500 lines, but no more.
So suppose I wanted to be able to view logs from farther back, what would I have to do to make them available? Is there some heroku setting/addon I need to implement in order for them to be permanently made available? Or is it the application layer's responsibility to store logs somewhere (e.g. persist them in a database)?
Heroku retains only the last 1,500 log lines, and they are at most one week old.
There are several add-ons you can use on Heroku, but they all have limitations (maximum lines, maximum retention) on a free plan.
Papertrail is a good solution I can recommend: a nice dashboard for queries and alerting, saved searches, and a free plan that includes the last 7 days of logs. Paid plans (starting at $8/month) relax these limits and let you keep logs for up to 1 year.
Not an answer, but just adding some advice I received in other forums:
Papertrail is awesome, it's a log management app (nothing to do with the ruby gem of the same name)
Some suggest not worrying about collecting all logs, but simply using something like Sentry so the errors are reported, which saves sifting through all the logs.
I am running a job on my Heroku app that generates about 300k lines of log within 5 minutes. I need to extract all of them into a file. How can I do this?
The Heroku UI only shows logs in real time, since the moment it was opened, and only keeps 10k lines.
I attached a LogDNA Add-on as a drain, but their export also only allows 10k lines export. To even have the option of export, I need to apply a search filter (I typed 2020 because all the lines start with a date, but still...). I can scroll through all the logs to see them, but as I scroll up the bottom gets truncated, so I can't even copy-paste them myself.
I then attached Sumo Logic as a drain, which is better because the export limit is 100k. However, I still need to filter the logs into 30 to 60 second intervals and download them separately. It also exports to a CSV file in reverse order (newest first, not what I want), so I still have to work on the file after it's downloaded.
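Reversing the export afterwards is at least mechanical. A small sketch, assuming the first row is a header (as in Sumo Logic's CSV exports):

```python
import csv
import io

def reverse_csv(text: str) -> str:
    """Reverse the row order of a CSV export, keeping the header row first.
    Assumes the first row is a header."""
    rows = list(csv.reader(io.StringIO(text)))
    out = io.StringIO()
    writer = csv.writer(out)
    # rows[:0:-1] is every row after the header, in reverse order
    writer.writerows([rows[0]] + rows[:0:-1])
    return out.getvalue()

print(reverse_csv("time,message\r\n3,c\r\n2,b\r\n1,a\r\n"))
```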
Is there no option to get actual raw log files in full?
Is there no option to get actual raw log files in full?
There are no actual raw log files.
Heroku's architecture requires that logging be distributed. By default, its Logplex service aggregates log output from all services into a single stream and makes it available via heroku logs. However,
Logplex is designed for collating and routing log messages, not for storage. It retains the most recent 1,500 lines of your consolidated logs, which expire after 1 week.
For longer persistence you need something else. In addition to commercial logging services like those you mentioned, you have several options:
Log to a database instead of files. Something like Apache Cassandra might be a good fit.
Send your logs to a logging server via Syslog (my preference):
Syslog drains allow you to forward your Heroku logs to an external Syslog server for long-term archiving.
Send your logs to a custom logging process via HTTPS.
Log drains also support messaging via HTTPS. This makes it easy to write your own log-processing logic and run it on a web service (such as another Heroku app).
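To give a feel for the last option: Heroku's HTTPS drains POST batches of syslog messages using octet-counting framing (each message prefixed by its byte length and a space). Here's a minimal receiver sketch; the port, file name, and drain URL are placeholder assumptions:

```python
from http.server import BaseHTTPRequestHandler, HTTPServer

def parse_frames(body: bytes) -> list[str]:
    """Split an octet-counted batch ("<len> <msg><len> <msg>...") into
    individual syslog messages, as delivered by Logplex HTTPS drains."""
    msgs, i = [], 0
    while i < len(body):
        sp = body.index(b" ", i)       # end of the length prefix
        length = int(body[i:sp])
        start = sp + 1
        msgs.append(body[start:start + length].decode())
        i = start + length
    return msgs

class DrainHandler(BaseHTTPRequestHandler):
    def do_POST(self):
        body = self.rfile.read(int(self.headers.get("Content-Length", 0)))
        with open("drain.log", "a") as f:   # append each message to a local file
            for msg in parse_frames(body):
                f.write(msg + "\n")
        self.send_response(200)
        self.end_headers()

# To run the receiver (then register it with
# `heroku drains:add https://your-host/logs -a your-app`):
# HTTPServer(("", 8080), DrainHandler).serve_forever()
```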
Speaking solely from the Sumo Logic point of view, since that’s the only one I’m familiar with here, you could do this with its Search Job API: https://help.sumologic.com/APIs/Search-Job-API/About-the-Search-Job-API
The Search Job API lets you kick off a search, poll it for status, and then when complete, page through the results (up to 1M records, I believe) and do whatever you want with them, such as dumping them into a CSV file.
But this is only available to trial and Enterprise accounts.
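If your account does have access, the flow is roughly: create a search job, poll its status, then page through the messages. Here's a sketch; the endpoint paths, field names, and the `DONE GATHERING RESULTS` state come from the docs linked above, but treat them as assumptions rather than something tested against a live account. The `api` object is anything with requests-style `.post`/`.get` methods:

```python
import time

def export_search(api, query, start, end, page_size=10_000):
    """Sketch of the Search Job workflow: create a job, poll until it
    finishes, then page through the result messages."""
    job = api.post("/api/v1/search/jobs",
                   json={"query": query, "from": start, "to": end}).json()
    job_id = job["id"]
    # Poll until the job has gathered all results
    while api.get(f"/api/v1/search/jobs/{job_id}").json()["state"] != "DONE GATHERING RESULTS":
        time.sleep(5)
    # Page through the messages
    offset, out = 0, []
    while True:
        batch = api.get(f"/api/v1/search/jobs/{job_id}/messages",
                        params={"offset": offset, "limit": page_size}).json()["messages"]
        if not batch:
            break
        out.extend(m["map"] for m in batch)
        offset += len(batch)
    return out
```

From there, dumping `out` into a CSV (oldest first) is straightforward with the `csv` module.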
I just looked at Heroku’s docs, and it does not look like they have a native way to retrieve more than 1,500 lines; you do have to forward those logs via syslog to a separate server / service.
I think your best solution is going to depend, however, on your use-case, such as why specifically you need these logs in a CSV.
I can see the logs for a heroku app with heroku logs -t
Is there a way to easily see how many hits an app has received in, say, the past 24 hours? (Preferably with a quick command in the CLI, but otherwise through the Heroku website.)
There are a few ways I see here:
The Heroku dashboard provides a Metrics tab, where you can see the throughput of your application.
If this is not exact enough, you can add a logging add-on (Logentries, for example) and then analyze the router logs there. Logentries provides counting, grouping, etc.
Same as a logging addon, but you also can add your own log drain and then analyze them yourself :)
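If you go the drain route, counting hits is straightforward because Heroku's router emits one log line per HTTP request. A small sketch, assuming the standard `heroku[router]` source tag:

```python
def count_hits(log_lines):
    """Count Heroku router entries (one per HTTP request).
    Assumes the standard format where the source/dyno field
    is 'heroku[router]'."""
    return sum(1 for line in log_lines if "heroku[router]" in line)

logs = [
    '2021-07-28T13:26:07+00:00 heroku[router]: at=info method=GET path="/"',
    '2021-07-28T13:26:07+00:00 app[web.1]: hello world',
    '2021-07-28T13:26:08+00:00 heroku[router]: at=info method=GET path="/x"',
]
print(count_hits(logs))  # → 2
```

You could extend the filter to bucket by timestamp if you want a per-24-hour count.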
When trying to send a push alert through Parse.com, I came across the following warning:
Installations without a known timezone will not receive this campaign.
So, how do I make sure Parse knows a user's timezone? Is there any specific code, or does it do that without the need for code, by default, making this a moot question?
Thanks!
This is recorded on a per-Installation basis by the Parse library and is updated automatically by the client.
You can verify that Parse is saving time zones by logging into your account, selecting 'Core' (at top) and 'Installation' (at left). You'll see a list of all current installations - the relevant column is timeZone.
The notice that you see when attempting to send a push is just a general reminder, not an indication that there is necessarily anything wrong on your end.
UPDATE 3/1/2015:
I found a bug in this, BTW, that some people might run into. So I'm posting it here in case it might help somebody.
In the current version of Parse, there is a bug wherein an iOS device with its Date & Time "Set Automatically" setting disabled will (potentially) report a timeZone value that Parse won't understand. In that event, local-time scheduled push notifications will not be sent to users with that setting turned off.
I verified this myself on two devices. With "Set Automatically" turned on, the Parse Installation is set to "America/Los_Angeles" (which is accurate for me). With it turned off, it is set to "US/Pacific". This is still accurate, obviously, but for some reason Parse does not like that value.
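For what it's worth, both identifiers are valid in the tz database — "US/Pacific" is a legacy alias for "America/Los_Angeles", so the value itself is legitimate even though Parse rejects it. You can check this with Python's zoneinfo (assumes system tzdata is available):

```python
from datetime import datetime, timezone
from zoneinfo import ZoneInfo  # Python 3.9+

# "US/Pacific" is a legacy tz-database alias for "America/Los_Angeles";
# both resolve to the same UTC offsets at any instant.
instant = datetime(2015, 3, 1, 12, 0, tzinfo=timezone.utc)
la = instant.astimezone(ZoneInfo("America/Los_Angeles"))
pacific = instant.astimezone(ZoneInfo("US/Pacific"))
print(la.utcoffset() == pacific.utcoffset())  # → True
```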
I imagine there are a non-trivial number of iOS users with that setting disabled, so I hope Parse fixes this.
I have an application that needs to stay in sync with google drive. To that end, I'm using the Changes feed that is described on this page.
I know the idea is to poll the changes feed so that I don't have to request a list of files and do a comparison. Right now I have it set to query every 30 seconds, and initiate a sync operation when the latest change number is updated. But, to make the application feel a little more responsive, I would like to query the API more frequently (but still initiate a sync only when necessary).
Given that, I was wondering if requests against the Changes feed count toward the API quota? I don't want to query more frequently if it's going to double my quota consumption rate.
It looks like requests to the changes API do count toward the quota. I found the Reports section in the cloud console. It gives a detailed breakdown of requests by user, location, method, and more. Looking through the methods, I found that drive.changes.list accounts for the majority of my usage.
It's unfortunate, but better than burning through the quota with multiple calls to get the status of every file.
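If you do keep polling, it helps to separate a single pass over the feed from the timing loop, so you can tune the interval independently. Here's a sketch against the Drive v3 `changes.list` method; the `drive` argument stands in for a googleapiclient service object (or anything with the same `.changes()` interface), which keeps the logic testable offline:

```python
def poll_once(drive, token):
    """One pass over the Drive v3 changes feed starting at `token`.
    Returns (changes, next_start_token). Note that each page fetched
    is one changes.list request against your per-user quota."""
    changes, page = [], token
    while page:
        resp = drive.changes().list(pageToken=page).execute()
        changes.extend(resp.get("changes", []))
        page = resp.get("nextPageToken")
        # Save this token for the next polling cycle
        token = resp.get("newStartPageToken", token)
    return changes, token

# Typical loop (token initially from changes.getStartPageToken):
#   changes, token = poll_once(service, token)
#   if changes: start_sync(changes)
#   time.sleep(30)
```

Lengthening the sleep when `changes` comes back empty a few times in a row is a cheap way to trade responsiveness against quota.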