The Altitude log.txt file is an absolute pain to dig through when you want to see what is happening on the server and figure out why your code might not be working right. Since it is optimized for computers, not people, I finally broke down and wrote a script to make the logs easy to use for troubleshooting. It lives with my alti+server code, but it does not need any Alti+ stuff; it is 100% stand-alone. All you need is perl with the JSON::XS and HTTP::Date modules. It is called altalyzer, and it is here:
https://github.com/biell/alti-server...ster/altalyzer
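If you don't already have those two modules, something like this will usually pull them in from CPAN (exactly how you install modules depends on your perl setup):

    cpan JSON::XS HTTP::Date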
What this script does is create a CSV file (a table, really) where every possible log field gets its own column, and every value is always displayed in the correct column. Data like "time" will always be lined up. I also augment rows by filling in nicknames and vaporIds when I can figure out the right value, so you don't have to.
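In case you are curious how that works, here is a minimal sketch of the column-union idea. This is not the real altalyzer; it assumes each line of log.txt is a single JSON object (which is why JSON::XS is needed) and skips the nickname/vaporId augmentation entirely:

    #!/usr/bin/perl
    # Sketch only: union every key seen in the JSON log lines into
    # one set of CSV columns, then print each entry in those columns.
    use strict;
    use warnings;
    use JSON::XS qw(decode_json encode_json);

    my (@rows, %seen);

    # Pass 1: parse each log line and remember every key we see.
    while (my $line = <>) {
        next unless $line =~ /^\s*\{/;              # skip non-JSON lines
        my $rec = eval { decode_json($line) };
        next unless ref($rec) eq 'HASH';
        $seen{$_} = 1 for keys %$rec;
        push @rows, $rec;
    }

    # Pass 2: header of all columns, then one row per log entry,
    # with every value in its own column (blank where absent).
    my @cols = sort keys %seen;
    print join(',', map { csv($_) } @cols), "\n";
    for my $rec (@rows) {
        print join(',', map { csv($rec->{$_}) } @cols), "\n";
    }

    # Quote a single value for CSV output.
    sub csv {
        my $v = shift // '';
        $v = encode_json($v) if ref $v;             # flatten nested data
        $v =~ s/"/""/g;
        return $v =~ /[",\n]/ ? qq("$v") : $v;
    }

The real script obviously does more than this (the nickname/vaporId augmentation, nicer time handling, and so on), but the column-union trick is the reason every value always lands in the right place.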
It takes any number of log files on the command line (I suggest listing them in chronological order) and produces a CSV file on stdout (I recommend redirecting that output to a file). Once you have the CSV file, load it up in Google Sheets, LibreOffice, Excel, etc. I recommend freezing the first row and making the entire sheet a filter table. After that, you can filter only the rows you need based on the various columns.
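Putting the command-line side of that together, a run might look something like this (the log file names here are just placeholders):

    perl altalyzer old-log.txt log.txt > logs.csv

Then open logs.csv in your spreadsheet and set up the frozen header row and filter table as described above.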
Using your spreadsheet's built-in filtering will let you easily narrow the data down to just the rows you need. You can also hide or delete the columns you don't need if that helps fit all the data on one screen.