Shallow Thoughts

Akkana's Musings on Open Source Computing and Technology, Science, and Nature.

Mon, 15 Apr 2019

Making a Land Ownership overlay: Categorized Styles in QGIS

Now that I know how to make a map overlay for OsmAnd, I wanted a land ownership overlay. When we're hiking, we often wonder whether we're on Forest Service, BLM, or NPS land, or private land, or Indian land. It's not easy to tell.

Finding Land Ownership Data

The first trick was finding the data. The New Mexico State Land Office has an interactive New Mexico Land Status map, but that's no help when walking around, and their downloadable GIS files only cover the lands administered by the state land office, which mostly don't include the areas where we hike. They do have some detailed PDF maps of New Mexico Lands, if you have a printer capable of printing enormous pages, which most of us don't.

In theory I could download their 11" x 17" Land Status PDF, convert it to a raster file, and georeference it as I described in the earlier article; but since they obviously have the GIS data (used for the interactive map) I'd much rather download the data and save myself all that extra work.

Eventually I found New Mexico ownership data at UNM's RGIS page, which has an excellent collection of GIS data available for download. Click on Boundaries, then download Surface Land Ownership. It's available in a variety of formats; I chose the geojson format because I find it the most readable and the easiest to parse with Python, though ESRI shapefiles might have been easier to use in QGIS.
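
Since it's geojson, you can poke at the data from Python before even opening QGIS. Here's a minimal sketch that counts how many polygons each owner has, using only the standard library; the filename is a made-up example, but "own" is the ownership attribute this dataset actually uses.

import json
from collections import Counter

# Count how many polygons each owner has in the RGIS ownership file.
# (Hypothetical filename; "own" is the dataset's ownership attribute.)
with open("surface_ownership.geojson") as fp:
    data = json.load(fp)

owners = Counter(feature["properties"]["own"]
                 for feature in data["features"])
for owner, count in owners.most_common():
    print("%6d  %s" % (count, owner))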

Colorizing Polygons in QGIS

You can run qgis on a geojson file directly. When it loads it shows the boundaries, and you can use the Info tool to click on a polygon and see its metadata -- ownership might be BLM, DOE, FS, I, or whatever. But they're all the same color, so it's hard to get a sense of land ownership just clicking around.

[QGIS categorized layers] To colorize the polygons differently, right-click on the layer name and choose Properties. For Style, choose Categorized. For Column, pick the attribute you want to use to choose colors: for this dataset, it's "own", for ownership.

Color ramp is initially set to random. Click Classify to generate an initial color ramp, then click Apply to see what it looks like on the map.

Then you can customize the colors by double-clicking on specific color swatches. For instance, by unstated convention most maps show Forest Service land as green, and BLM and Indian land as various shades of brown. Click Apply as you change colors, until you're happy with the result.
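
If you'd rather script the styling than click through dialogs, the same categorized renderer can be set up from the QGIS Python console. This is just a sketch, assuming the QGIS 3 API (in QGIS 2 the class was named QgsCategorizedSymbolRendererV2), and the colors are my own arbitrary picks:

# Run from the QGIS Python console with the ownership layer active.
# QGIS 3 API assumed; the colors are arbitrary examples.
from qgis.core import (QgsCategorizedSymbolRenderer,
                       QgsRendererCategory, QgsSymbol)
from qgis.PyQt.QtGui import QColor

layer = iface.activeLayer()
colors = { "FS": "#4daf4a",       # Forest Service: green
           "BLM": "#a6761d",      # BLM: brown
           "I": "#b15928" }       # Indian land: another brown

categories = []
for value, colorname in colors.items():
    symbol = QgsSymbol.defaultSymbol(layer.geometryType())
    symbol.setColor(QColor(colorname))
    # optionally thin the black outlines between polygons:
    # symbol.symbolLayer(0).setStrokeWidth(0.1)
    categories.append(QgsRendererCategory(value, symbol, value))

layer.setRenderer(QgsCategorizedSymbolRenderer("own", categories))
layer.triggerRepaint()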

Exporting to GeoTIFF

You can export the colored layer to GeoTIFF using QGIS' confusing and poorly documented Print Composer. Create one with: Project > New Print Composer, which will open with a blank white canvas.

Zoom and pan in the QGIS window so the full extent of the image you want to export is visible. Then, in the Print Composer, Layout > Add Map. Click and drag in the blank canvas, going from one corner to the opposite corner, and some portion of the map should appear.

There doesn't seem to be any way to tell Print Composer to import your whole map automatically, or to control what portion of the map from the QGIS window will show up when you drag. If you guess wrong and don't get all of your map, hit Delete, switch to the QGIS window and drag and/or zoom your map a little, then switch back to Print Composer and try adding it again.

You can also make adjustments by changing the Extents in the Item Properties tab, and clicking the Set to map canvas extent button in that tab will enlarge your extents to cover approximately what's currently showing in the QGIS window.

It's a fiddly process and there's not much control, but when you decide it's close enough, Composer > Export as Image... and choose TIFF format. (Print Composer offers both TIFF and TIF; I don't know if there's a difference; I only tried TIFF with two effs.) That should write a GeoTIFF; to verify, go to a terminal and run gdalinfo on the saved TIFF file and make sure it says it's GeoTIFF.
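
The GDAL Python bindings can do the same check, if you'd rather verify it programmatically; a minimal sketch, with a made-up filename:

# Verify the exported file is a GeoTIFF with real-world coordinates.
# Needs the GDAL Python bindings (python-gdal on Debian).
from osgeo import gdal

ds = gdal.Open("landownership.tiff")
print("Driver:", ds.GetDriver().LongName)       # should say GeoTIFF
print("Geotransform:", ds.GetGeoTransform())    # origin and pixel size
print("Projection:", ds.GetProjection()[:60])   # should mention the CRS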

Load into OsmAnd

[Land ownership overlay in OsmAnd] Finally, load the image into OsmAnd's tiles folder as discussed in the previous article, then bring up the Configure map menu and enable the overlay.

I found that the black lines dividing the various pieces of land are a bit thicker than I'd like. You can't get that super accurate "I'm standing with one foot in USFS land and the other foot in BLM land" feeling because of the thick black DMZ dividing them. But that's probably just as well: I suspect the data doesn't have pinpoint accuracy either. I'm sure there's a way to reduce the thickness of the black line or eliminate it entirely, but for now, I'm happy with what I have.

[ 18:13 Apr 15, 2019    More mapping | permalink to this entry | comments ]

Wed, 10 Apr 2019

Making Overlay Maps for OsmAnd on Linux

For many years I've wished I could take a raster map image, like a geology map, an old historical map, or a trail map, and overlay it onto the map shown in OsmAnd so I can use it on my phone while walking around. I've tried many times, but there are so many steps and I never found a method that worked.

Last week, the ever helpful Bart Eisenberg posted to the OsmAnd list a video he'd made: Displaying web-based maps with MAPC2MAPC: OsmAnd Maps & Navigation. Bart makes great videos ... but in this case, MAPC2MAPC turned out to be a Windows program so it's no help to a Linux user. Darn!

But seeing his steps laid out inspired me to try again, and gave me some useful terms for web searching. And this time I finally succeeded. I was also helped by a post to the OsmAnd list by A Thompson, How to get aerial image into offline use?, though I needed to change a few of the steps. (Note: click on any of the screenshots here to see a larger version.)

Georeference the Image Using QGIS

The first step is to georeference the image: turn the plain raster image into a GeoTIFF that has references showing where on Earth its corners are. It turns out there's an open source program that can do that: QGIS. Although it's poorly documented, it's fairly easy once you figure out the trick.

I started with the tutorial Georeferencing Basics, but it omits one important point, which I finally found in BBRHUFT's How to Georeference a map in QGIS. Step 11 is the key: the Coordinate Reference System (CRS) must be the same in the georeferencing window as it is in the main QGIS window. That sounds like a no-brainer, but in practice the lists of possible CRSes shown in the two windows don't overlap, so it's easy to end up with CRSes that don't match. It'll look like it's working, but the resulting GeoTIFF will have coordinates nowhere near where they should be.

Instead, follow BBRHUFT's advice and type 3857 into the filter box in both windows. The "WGS 84 / Pseudo Mercator" CRS will show up and you can use it in both places. Then the GeoTIFF will come out in the right place.

If you're starting from a PDF, you may need to convert it to a raster format like PNG or JPG first. GIMP can do that.

So, the full QGIS steps are:

1. If your map is a PDF, convert it to a raster format such as PNG or JPG first (GIMP can do that).
2. Set the main QGIS window's CRS to EPSG:3857, "WGS 84 / Pseudo Mercator".
3. Open the Georeferencer from the Raster menu, load the raster image, and set its CRS to 3857 as well.
4. Add ground control points tying recognizable spots on the image to their real-world map coordinates.
5. Run the georeferencing, which writes out the GeoTIFF.

Convert the GeoTIFF to Map Tiles

The ultimate goal is to convert to OsmAnd's sqlite format, but there's no way to get there directly. First you have to convert it to map tiles in a format called mbtiles.

QGIS has a plug-in called QTiles but it didn't work for me: it briefly displayed a progress bar which then disappeared without creating any files. Fortunately, you can do the conversion much more easily with gdal_translate, which at least on Debian is part of the gdal-bin package.

gdal_translate filename.tiff filename.mbtiles

That will create tiles for a limited range of zoom levels (maybe only one zoom level). gdalinfo will tell you the zoom levels in the file. If you want to be able to zoom out and still see your overlay, you might want to add wider zoom levels, which you can do like this:

gdaladdo -r nearest filename.mbtiles 2 4 8 16
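
To confirm the added levels took, gdalinfo on the .mbtiles file lists them as overviews, and the same check works from the GDAL Python bindings. A quick sketch (filename made up):

# List the zoom levels (overviews) in the tiled file.
from osgeo import gdal

ds = gdal.Open("filename.mbtiles")
band = ds.GetRasterBand(1)
print("Base size: %d x %d" % (ds.RasterXSize, ds.RasterYSize))
print("Extra zoom levels:", band.GetOverviewCount())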

Incidentally, gdal can also create a directory of tiles suitable for a web slippy map, though you don't need that for OsmAnd. For that, use gdal2tiles, which on Debian is part of the python-gdal package:

mkdir tiles
gdal2tiles filename.tiff tiles

Not only does it create tiles, it also includes multiple HTML files you can use to display those tiles using the Leaflet, OpenLayers or Google Maps JavaScript libraries. Very cool!

Create the OsmAnd sqlite file

Tarwirdur has written a nice simple Python script to translate from mbtiles to OsmAnd sqlite: mbtiles2osmand.py. Download it, then run

mbtiles2osmand.py filename.mbtiles filename.sqlitedb

So easy to use! Most of the other references I saw said to use Mobile Atlas Creator (MOBAC) and that looked a lot more complicated.

Incidentally, Bart's video says MAPC2MAPC calls the format "Locus/Rmaps/Galileo/OSMAND (sqlite)", which might be useful to know for web search purposes.

Install in OsmAnd

[Georeferenced map overlay in OsmAnd] Once you have the .sqlitedb file, copy it to OsmAnd's tiles folder in whatever way you prefer. For me, that's adb push file.sqlitedb $androidSD/Android/data/net.osmand.plus/files/tiles where $androidSD is the /storage/whatever location of my device's SD card.

Then start OsmAnd and tap on the icon in the upper left for your current mode (car, bike, walking etc.) to bring up the Configure map menu. Scroll down to Overlay or Underlay map, enable one of those two and you should be able to choose your newly installed map.

You can adjust the overlay's transparency with a slider that's visible at the bottom of the map (the blue slider just above the distance scale), so you can see your overlay and the main map at the same time.

The overlay disappears if you zoom out too far, and I haven't yet figured out what controls that; I'm still working on those details.

Sure, this process is a lot of work. But the result is worth it. Check out the geologic layers we walked through on a portion of a recent hike in Rendija Canyon (our hike is the purple path).

[ 19:08 Apr 10, 2019    More mapping | permalink to this entry | comments ]

Thu, 28 Mar 2019

Antler in the Yard

We see mule deer often enough that I've wondered if we would ever find a shed antler in the yard.

[Shed antler] A couple of days ago Dave found one. It's just a 4-pointer, but it's a big 4-pointer. We're still hoping its mate might be somewhere nearby, but no luck so far.

It feels enormously heavy, though the scale says it's just shy of two pounds. Still, carrying around four pounds on your head all the time ... sounds like a recipe for a headache.

The day after the antler turned up, five bucks visited our garden in the evening. Four of them still had substantial buttons where antlers had recently been. The fifth still had antlers (a ten-pointer).

I have to wonder, what do they think about that? Is the guy who still has antlers the macho king of the yard? Or are the other four saying "Gosh, I'm so glad mine dropped last week, so sorry you still have to carry that rack around"?

[ 15:51 Mar 28, 2019    More nature | permalink to this entry | comments ]

Sat, 23 Feb 2019

Natural Topiary

Some months after we moved into our new house, we came home to notice a really ugly job of tree trimming on some junipers in the driveway.

[deer topiary] We hadn't done it, nor had we requested tree trimming service. Yet we'd been in the house for months, and we were both pretty sure that the junipers hadn't looked like that before. They were nearly denuded at the bottom, belling out higher up.

Was there some rogue topiarist wandering the neighborhood, defoliating people's trees without asking them? When I was growing up, occasionally my grandmother would show up and slash branches off trees in our backyard without asking. But it made no sense that anyone in our neighborhood would do that.

We hoped the trees would grow back to their former bushiness, but several years later, they don't look much better. And all that time it has remained a mystery how the trees came to look like that.

Until a party a few months ago, when a visiting friend cast a knowing look at the trees and said, "I see the deer have been visiting you."

[deer topiary] Of course! Somehow that had never occurred to us. We went out to the driveway and checked -- and sure enough, the trimmed parts of the trees go up to roughly the height a deer could easily reach.

So there is a rogue topiarist wandering the neighborhood after all. Lots of them, in fact. Most of the time they eat more tempting fare; but when the weather gets inclement and there isn't much to eat, they'll go after the junipers. And I guess these non-native junipers in the driveway are a little tastier than the native ones that are all around the property.

We don't feel quite so resentful about our unwanted tree trimming now. Sure, we're still not crazy about the look of our oddly shaped junipers, and they've gotten even worse during the current exceptionally snowy winter; but now that we know it's a natural process, not some crazed shear-wielding neighbor, it's hard to be too upset by it.

[ 11:35 Feb 23, 2019    More nature | permalink to this entry | comments ]

Tue, 12 Feb 2019

Reading the Output of a Weather Station Using Software Defined Radio

[Weather station] A while back, Dave ordered a weather station. His research pointed to the Ambient Weather WS-2000 as the best bang for the buck as far as accuracy (after it's calibrated, a time-consuming and exacting process of comparing readings to a known-good mercury thermometer, which I suspect most weather station owners don't bother with).

It comes with a little 7" display console that sits indoors and reads the radio signal from the outside station as well as a second thermometer inside, then displays all the current weather data. It also uses wi-fi to report the data upstream to Ambient and, optionally, to a weather site such as Wunderground. (Which we did for a while, but now Wunderground is closing off their public API, so why give them data if they're not going to make it easy to share it?)

[Weather station console] Having the console readout and the Ambient "dashboard" is all very nice, but of course, being a data geek, I wanted a way to get the data myself, so I could plot it, save it or otherwise process it. And that's where Ambient falls short. The console, though it's already talking on wi-fi, gives you no way to get the data. They sell a separate unit called an "Observer" that provides a web page you can scrape. We ordered one, but it turned out to be buggy and difficult to use: it gave numbers substantially different from what the console showed and randomly failed to answer, so we ended up returning it for a refund.

The other way of getting the data is online. Ambient provides an API you can use for that purpose, if you email them for a key. It mostly works, but it sometimes lags considerably behind real time, and it seems crazy to have to beg for a key and then fetch data from a company web site when that data originated in our own backyard.
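
(For anyone who does go the API route: it's a simple REST call once you have the two keys. The endpoint and parameter names in this sketch are from memory of Ambient's documentation, so treat them as assumptions and double-check against the official docs.)

# Sketch: fetch current conditions from Ambient's cloud API.
# Endpoint and parameter names are assumptions -- check Ambient's docs.
import requests

url = "https://api.ambientweather.net/v1/devices"
params = { "applicationKey": "YOUR_APP_KEY",
           "apiKey": "YOUR_API_KEY" }
for device in requests.get(url, params=params).json():
    last = device["lastData"]               # most recent readings
    print(last.get("tempf"), "F,", last.get("humidity"), "% RH")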

What I really wanted to do was read the signal from the weather station directly. I'd planned for ages to look into how to do that, but I'm a complete newbie to software defined radio and wasn't sure where to start. Then one day I noticed an SDR discussion on the #raspberrypi IRC channel on Freenode where I often hang out. I jumped in, asked some questions, and got pointed in the right direction and referred to the friendly and helpful #rtlsdr Freenode channel.

An Inexpensive SDR Dongle

[Raspberry Pi with SDR dongle] On the experts' advice, I ordered an RTL-SDR Blog R820T2 RTL2832U 1PPM TCXO SMA Software Defined Radio with 2x Telescopic Antennas on Amazon. This dongle apparently has better temperature compensation than cheaper alternatives, it came with a couple of different antenna options, and I was told it should work well with Linux using a program called rtl_433.

Indeed it did. The command to monitor the weather station is

rtl_433 -f 915M

rtl_433 already knows the protocol for the WS-2000, so I didn't even need to do any decoding or reverse engineering; it produces a running account of the periodic signals being broadcast from the station. rtl_433 also helpfully offers -F json and -F csv options, along with a few other formats. What a great program!

JSON turned out to be the easiest for me to use; initially I thought CSV would be more compact, but rtl_433's CSV format includes fields for every possible quantity a weather station could ever broadcast. When you think about it, that makes sense: once you're outputting CSV you can't add a new field in mid-stream, so you'd better be ready for anything. JSON, on the other hand, lets you report just whatever the weather station reports, and it's easy to parse from nearly any programming language.
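
Reading that stream from Python takes only a few lines: run rtl_433 as a subprocess and decode each line as JSON. A minimal sketch of the approach (the field names are the ones in the sample output further down):

# Minimal sketch: run rtl_433 and parse its JSON output line by line.
import json
import subprocess

proc = subprocess.Popen(["rtl_433", "-f", "915M", "-F", "json"],
                        stdout=subprocess.PIPE, universal_newlines=True)
for line in proc.stdout:
    try:
        data = json.loads(line)
    except ValueError:
        continue        # skip any non-JSON status chatter
    print(data["time"], data.get("temperature_C"), data.get("humidity"))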

Testing the SDR Dongle

Full disclosure: at first, rtl_433 -f 915M wasn't showing me anything and I still don't know why. Maybe I had a loose connection on the antenna, or maybe I got unlucky and the weather station picked the exact wrong time to take a vacation. But while I was testing, I found another program that was very helpful in testing whether my SDR dongle was working: rtl_fm, which plays radio stations. The only trick is finding the right arguments, since the example from the man page just played static. Here's what worked for me:

rtl_fm -f 101.1M -M fm -g 20 -s 200k -A fast -r 32k -l 0 -E deemp | play -r 32k -t raw -e s -b 16 -c 1 -V1 -

That command plays the 101.1 FM radio station. (I had to do a web search to find the frequencies of some local radio stations; it's been a long time since I listened to normal radio.)

Once I knew the dongle was working, I needed to verify what frequency the weather station was using for its broadcasts. What I really wanted was something that would scan frequencies around 915M and tell me what it found. Everyone kept pointing me to a program called Gqrx. But it turns out Gqrx on Linux requires PulseAudio and absolutely refuses to work or install without it, even if you have no interest in playing audio. I didn't want to break my system's sound (I've never managed to get sound working reliably under PulseAudio), and although it's supposedly possible to build Gqrx without Pulse, it's a difficult build: I saw plenty of horror stories, and it requires Boost, always a sign that a build will be problematic. I fiddled with it a little but decided it wasn't a good time investment.

I eventually found a scanner that worked: RTLSDR-Scanner. It let me set limiting frequencies and scan between them, and by setting it to accumulate, I was able to verify that indeed, the weather station (or something else!) was sending a signal on 915 MHz. I guess by then, the original problem had fixed itself, and after that, rtl_433 started showing me signals from the weather station. It's not super polished, but it's the only scanner I've found that works without requiring PulseAudio.

That Puzzling Rainfall Number

One mystery remained to be solved. The JSON I was getting from the weather station looked like this (I've reformatted it for readability):

{
    "time" : "2019-01-11 11:50:12",
    "model" : "Fine Offset WH65B",
    "id" : 60,
    "temperature_C" : 2.200,
    "humidity" : 94,
    "wind_dir_deg" : 316,
    "wind_speed_ms" : 0.064,
    "gust_speed_ms" : 0.510,
    "rainfall_mm" : 90.678,
    "uv" : 324,
    "uvi" : 0,
    "light_lux" : 19344.000,
     "battery" : "OK",
    "mic" : "CRC"
}

This on a day when it hadn't rained in ages. What was up with that "rainfall_mm" : 90.678 ?

I asked on the rtl_433 list and got a prompt and helpful answer: it's a cumulative number, counting since some unspecified time in the past (possibly the last time the battery was changed?). So as long as I make a note of the rainfall_mm number, any change in that number means new rainfall.
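
Turning that cumulative counter into per-report rainfall is just a subtraction, plus a guard in case the counter ever resets; something like this sketch:

# Convert cumulative rainfall_mm readings into rain since last report.
last_rain = None

def new_rainfall(rainfall_mm):
    '''Given the station's cumulative rainfall_mm value, return
       how much rain (mm) has fallen since the previous report.'''
    global last_rain
    if last_rain is None or rainfall_mm < last_rain:
        rain = 0.0      # first reading, or the counter reset
    else:
        rain = rainfall_mm - last_rain
    last_rain = rainfall_mm
    return rain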

This being a snowy winter, I haven't been able to test that yet: the WS-2000 doesn't measure snowfall unless some snow happens to melt in the rain cup.

Some of the other numbers, like uv and uvi, are in mysterious unknown units and sometimes give results that make no sense (why doesn't uv go to zero at night? You're telling me that there's that much UV in starlight?), but I already knew that was an issue with the Ambient. It's not rtl_433's fault.

I notice that the numbers are often a bit different from what the Ambient API reports; apparently they do some massaging of the numbers, and the console has its own adjustment factors too. We'll have to do some more calibration with a mercury thermometer to see which set of numbers is right.

Anyway, cool stuff! It took no time at all to write a simple client for my WatchWeather web app that runs rtl_433 and monitors the JSON output. I already had WatchWeather clients collecting reports from Raspberry Pi Zero Ws sitting at various places in the house with temperature/humidity sensors attached; and now my WatchWeather page can include the weather station itself.

Meanwhile, we donated another weather station to the Los Alamos Nature Center, though it doesn't have the SDR dongle, just the normal Ambient console reporting to Wunderground.

[ 13:20 Feb 12, 2019    More tech | permalink to this entry | comments ]

Sun, 03 Feb 2019

A Spectacular Lenticular

I was hurrying to check in some tweaks to the BillTracker before leaving to join some friends for dinner when I noticed some beautiful clouds over the Sangre de Cristos.

"That's a spectacular lenticular!" I exclaimed, grabbing the camera.

[Spectacular lenticular cloud]

The clouds got even better ten minutes later as the sunset turned the clouds salmon-pink, but alas by then we were pulling up to a friend's driveway and didn't have a clear view, and by the time I did, I'd lost the light.

When I offloaded the photos from the camera's SD card this morning to see how the photos came out, I found some older photos from our snowstorm a few weeks ago. In particular, a photo of that curly dragon-droop glacier above the den deck after it fell. Before and after:

[before photo] [after photo]

[ 17:58 Feb 03, 2019    More | permalink to this entry | comments ]

Fri, 25 Jan 2019

Announcing the New Mexico Bill Tracker

For the last few weeks I've been consumed with a project I started last year and then put aside for a while: a bill tracker.

The project sprang out of frustration at the difficulty of following bills as they pass through the New Mexico legislature. Bills I was interested in would die in committee, or they would make it to a vote, and I'd read about it a few days later and wish I'd known that it was a good time to write my representative or show up at the Roundhouse to speak. (I've never spoken at the Roundhouse, and whether I'd have the courage to actually do it remains to be seen, but at least I'd like to have the chance to decide.)

New Mexico has a Legislative web site where you can see the status of each bill, and they even offer a way to register and save a list of bills; but then there's no way to get alerts about bills that change status and might be coming up for debate.

New Mexico legislative sessions are incredibly short: 60 days in odd years, 30 days in even. During last year's 30-day session, I wrote some Python code that scraped the HTML pages describing a bill, extracted the useful information (like when the bill last changed status and where it was right now), presented the information in a table the user could easily scan, and emailed the user a daily summary. Fortunately, the nmlegis.gov site, while it doesn't offer raw data for bill status, at least uses lots of id tags in its HTML, which makes the pages relatively easy to scrape.
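
The scraping itself boils down to fetching a bill's page and pulling out elements by id. Here's a stripped-down sketch of the idea using BeautifulSoup; the URL pattern and the id are illustrative stand-ins, not necessarily the ones nmlegis.gov really uses:

# Sketch: scrape a bill's status by HTML id.
# URL pattern and id below are illustrative, not nmlegis.gov's real ones.
import requests
from bs4 import BeautifulSoup

url = "https://www.nmlegis.gov/Legislation/Legislation?chamber=H&legno=1"
soup = BeautifulSoup(requests.get(url).text, "html.parser")

status = soup.find(id="MainContent_statusLabel")    # hypothetical id
if status:
    print(status.get_text(" ", strip=True))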

Then the session ended and there was no further way to test it, since bills' statuses were no longer changing. So the billtracker moved to the back burner.

In the runup to this year's 60-day session, I started with Flask, a lightweight Python web library I've used for a couple of small projects, and added some extensions that help Flask handle tasks like user accounts. Then I patched in the legislative web scraping code from last year, and the result was The New Mexico Bill Tracker. I passed the word to some friends in the League of Women Voters and the Sierra Club to help me test it, and I think (hope) it's ready for wider testing.

There's lots more I'd like to do, of course. I still have no way of knowing when a bill will be up for debate. It looks like this year the Legislative web site is showing committee schedules in a fairly standard way, as opposed to the unparseable PDFs they used in past years, so I may be able to get that. Not that legislative committees actually stick to their published schedules, but at least it's a start.

New Mexico readers (or anyone else interested in following the progress of New Mexico bills) are invited to try it. Let me know about any problems you encounter. And if you want to adapt the billtracker for use in another state, send me a note! I'd love to see it extended and would be happy to work with you. Here's the source: BillTracker on GitHub.

[ 12:34 Jan 25, 2019    More politics | permalink to this entry | comments ]

Fri, 18 Jan 2019

Python: Find Your System's Biggest CPU Hogs

My machine has recently developed an overheating problem. I haven't found a solution for that yet -- you'd think Linux would have a way to automatically kill or suspend processes based on CPU temperature, but apparently not -- but my investigations led me down one interesting road: how to write a Python script that finds CPU hogs.

The psutil module can get a list of processes with psutil.process_iter(), which returns Process objects that have a cpu_percent() call. Great! Except it always returns 0.0, even for known hogs like Firefox, or VLC playing video scaled to the monitor size.

That's because cpu_percent() needs to run twice, with an interval in between: it records the elapsed run time and sees how much it changes. You can pass an interval to cpu_percent() (the units aren't documented, but apparently they're seconds). But if you're calling it on more than one process -- as you usually will be -- it's better not to wait for each process. You have to wait at least a quarter of a second to get useful numbers, and longer is better. If you do that for every process on the system, you'll be waiting a long time.

Instead, use cpu_percent() in non-blocking mode. Pass None as the interval (or leave it blank since None is the default), then loop over the process list and call proc.cpu_percent(None) on each process, throwing away the results the first time. Then sleep for a while and repeat the loop: the second time, cpu_percent() will give you useful numbers.

import psutil
import sys
import time

def hoglist(delay=5):
    '''Return a list of processes using a nonzero CPU percentage
       during the interval specified by delay (seconds),
       sorted so the biggest hog is first.
    '''
    processes = list(psutil.process_iter())
    for proc in processes:
        proc.cpu_percent(None)    # non-blocking; throw away first bogus value

    print("Sleeping ...")
    sys.stdout.flush()
    time.sleep(delay)
    print()

    procs = []
    for proc in processes:
        try:
            percent = proc.cpu_percent(None)
            if percent:
                procs.append((proc.name(), percent))
        except psutil.NoSuchProcess:
            pass                  # the process exited during the sleep

    procs.sort(key=lambda x: x[1], reverse=True)
    return procs

if __name__ == '__main__':
    hogs = hoglist()
    for p in hogs:
        print("%20s: %5.2f" % p)

It's a useful trick. Though actually applying this to a daemon that responds to temperature, to solve my overheating problem, is more complicated. For one thing, you need rules about special processes. If your Firefox goes wonky and starts making your X server take lots of CPU time, you want to suspend Firefox, not the X server.
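
Still, the skeleton of such a daemon isn't much code, since psutil can also read temperature sensors on most Linux machines. Here's a rough sketch building on hoglist() above; the sensor handling, the threshold and the whitelist are all made-up placeholders you'd have to tune for your own machine:

# Rough sketch of a daemon that reacts when the CPU runs hot.
# Threshold, poll interval and whitelist are hypothetical placeholders.
import time
import psutil

NEVER_SUSPEND = { "Xorg", "systemd" }   # processes we must not touch
MAX_TEMP_C = 85.0                       # tune for your hardware

def cpu_temp():
    '''Return the first CPU temperature psutil can find (crude).'''
    for entries in psutil.sensors_temperatures().values():
        for entry in entries:
            return entry.current
    return 0.0

while True:
    if cpu_temp() > MAX_TEMP_C:
        for name, percent in hoglist(delay=2):
            if name not in NEVER_SUSPEND:
                print("Would suspend %s (%.1f%%)" % (name, percent))
                # psutil's Process objects offer .suspend()/.resume()
                break
    time.sleep(30)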

[ 15:54 Jan 18, 2019    More programming | permalink to this entry | comments ]