Me and the mainframe

I recently wrote a sponsored blog for VirtualZ Computing, a startup involved in innovative mainframe software. As I was writing the post, I was thinking about the various points in my professional life where I came face-to-CPU with IBM’s Big Iron, as it once was called. (For what passes for comic relief, check out this series of videos about selling mainframes.)

My last job working for a big IT shop came about in the summer of 1984, when I moved across the country to LA to work for an insurance company. The company was a huge customer of IBM mainframes and was just getting into buying PCs for its employees, including mainframe developers and ordinary employees (no one called them end users back then) who wanted to run their own spreadsheets and create their own documents. There were hundreds of people working on and around the mainframe, which was housed in its own inner sanctum, a raised-floor series of rooms. I wrote about this job here, and it was interesting because it was the last time I worked in IT before switching careers to tech journalism.

Back in 1984, if I wanted to write a program, I had to first create it by typing out a deck of punch cards. This was done at a special station that was the size of a piece of office furniture. Each card could hold the instructions for a single line of code. If you made a mistake you had to toss the card and start anew. When you had your deck you would then feed it into a specialized card reader that would transfer the program to the mainframe and create a “batch job” – meaning my program would then run sometime during the middle of the night. I would get my output the next morning, if I was lucky. If I made any typing errors on my cards, the printout would be a cryptic set of error messages, and I would have to fix the errors and try again the next night. Finding that meager output was akin to getting a college rejection letter in the mail – the acceptances came in thick envelopes. Am I dating myself enough here?

Today’s developers are probably laughing at this situation. They have coding environments that immediately flag syntax errors, tools that dynamically stop embedded malware from being run, and all sorts of other fancy tricks. If they have to wait more than 10 milliseconds for this feedback, they complain about how slow their platform is. Code is put into production in a matter of moments, rather than the months we had to endure back in the day.

Even though I roamed around the three downtown office towers that housed our company’s workers, I don’t remember ever setting foot in our Palais d’mainframe. However, over the years I have been to my share of data centers across the world. One visit involved turning off a mainframe for Edison Electric Institute in Washington DC in 1993; I wrote about the experience and how Novell Netware-based apps replaced many of its functions. Another involved moving a data center from a basement (which would periodically flood) into a purpose-built building next door, in 2007. That data center housed souped-up microprocessor-based servers, the beginnings of the massive CPU collections that are used in today’s z Series mainframes, by the way.

Mainframes had all sorts of IBM gear that required care and feeding, and lots of knowledge that I used to have at my fingertips: I knew my way around IBM’s proprietary Systems Network Architecture protocols and its proprietary Token Ring networking, for example. And let’s not forget that the mainframe ran programs written in COBOL, and used all sorts of other hardware to connect things together with proprietary bus-and-tag cables. When I was making the transition to PC Week in the 1980s, IBM was making the (eventually failed) transition to peer-to-peer mainframe networking with a bunch of proprietary products. Are you seeing a trend here?

Speaking of the IBM PC, it was the first product from IBM built with parts made by others, rather than its own stuff. That was a good decision, and the PC was successful because you could add a graphics card (the first PCs just did text, and monochrome at that), extra memory, or a modem. Or an adapter card that connected to another cabling scheme (coax) and turned the PC into a mainframe terminal. Yes, this was before wireless networks became useful, and you can see why.

Now IBM mainframes — there are some 10,000 of them still in the wild — come with the ability to run Linux and operate across TCP/IP networks, and about a third of them run Linux as their main OS. This is akin to having one foot in the world of distributed cloud computing and one foot back in the dinosaur era. So let’s talk about my client VirtualZ and where they come into this picture.

They created software – mainframe software – that enabled distributed applications to access mainframe data sets, using OpenAPI protocols and database connectors. The data stays put on the mainframe but is available to applications that we know and love such as Salesforce and Tableau.  It is a terrific idea, just like the original IBM PC in that it supports open systems. This makes the mainframe just another cloud-connected computer, and shows that the mainframe is still an exciting and powerful way to go.

Until VirtualZ came along, developers who wanted access to mainframe data had to go through all sorts of contortions to get it — much like what we had to do in the 1980s and 1990s, for that matter. Companies like Snowflake and Fivetran built very successful businesses out of doing these “extract, transform and load” operations into what are now called data warehouses. VirtualZ eliminates these steps, and your data is available in real time, because it never leaves the cozy comfort of the mainframe, with all of its minions and backups and redundant hardware. You don’t have to build a separate warehouse in the cloud, because your mainframe is now cloud-accessible all the time.

I think VirtualZ’s software will usher in a new mainframe era, one that puts us even further from the punch card days. But it shows the power and persistence of the mainframe, and how IBM had the right computer for today’s enterprise data, just not the right context when it was invented. For Big Iron to succeed in today’s digital world, it needs a lot of help from little iron.

The Cloud-Ready Mainframe: Extending Your Data’s Reach and Impact

(This post is sponsored by VirtualZ Computing)

Some of the largest enterprises are finding new uses for their mainframes. And instead of competing with cloud and distributed computing, the mainframe has become a complementary asset that adds new productivity and a level of cost-effective scale to existing data and applications. 

While the cloud does quite well at elastically scaling up resources as application and data demands increase, the mainframe is purpose-built for the largest-scale digital applications. More importantly, it has kept pace as these demands have mushroomed over its 60-year reign, which is why so many large enterprises continue to use them. Having mainframes as part of a distributed enterprise application portfolio can be a significant and savvy use case, and a reason for increasing their future role and importance.

Estimates suggest that there are about 10,000 mainframes in use today, which may not seem like a lot except that they can be found in more than two-thirds of Fortune 500 companies. In the past, they used proprietary protocols such as Systems Network Architecture, ran applications written in now-obsolete coding languages such as COBOL, and relied on custom CPU hardware. Those days are behind us: instead, the latest mainframes run Linux and TCP/IP across hundreds of multi-core microprocessors.

But even speaking cloud-friendly Linux and TCP/IP doesn’t remove two main problems for mainframe-based data. First off, many mainframe COBOL apps are their own islands, isolated from the end-user Java experience, coding pipelines, and programming tools. Breaking this isolation usually means an expensive effort to convert and audit the code.

A second issue has to do with data lakes and data warehouses. These have become popular ways for businesses to spot trends quickly and adjust IT solutions as their customers’ data needs evolve. But the underlying applications typically require near real-time access to existing mainframe data, such as financial transactions, sales and inventory levels, or airline reservations. At the core of any lake or warehouse is a series of “extract, transform and load” (ETL) operations that move data back and forth between the mainframe and the cloud. These operations only capture data at a particular moment in time, and they require custom programming to accomplish.
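To make the snapshot problem concrete, here is a minimal sketch of what one of those nightly ETL jobs might look like in Python. The dataset name, the canned extract, and the warehouse "load" step are hypothetical stand-ins rather than any vendor's actual tooling; the point is simply that the warehouse only ever sees the data as of the moment the batch ran.

```python
# A minimal sketch of a nightly "extract, transform and load" (ETL) batch job.
# The mainframe extract and warehouse client below are hypothetical placeholders;
# real deployments would use vendor-specific drivers and file transfers.
import csv
import datetime
import io


def extract_mainframe_dataset(dataset_name: str) -> str:
    """Pretend to pull a flat-file export of a mainframe dataset.

    In practice this might be an FTP transfer of a DB2 or VSAM unload;
    here we return canned CSV text so the sketch stays runnable.
    """
    return "account,balance\n1001,250.75\n1002,9800.00\n"


def transform(raw_csv: str) -> list[dict]:
    """Reshape the extract into rows the warehouse expects."""
    rows = []
    for record in csv.DictReader(io.StringIO(raw_csv)):
        rows.append({
            "account_id": int(record["account"]),
            "balance_usd": float(record["balance"]),
            # The snapshot timestamp is the crux of the problem: the copy
            # is only as fresh as the moment this batch job ran.
            "as_of": datetime.datetime.now(datetime.timezone.utc).isoformat(),
        })
    return rows


def load_into_warehouse(rows: list[dict]) -> None:
    """Stand-in for a bulk insert into a cloud data warehouse."""
    for row in rows:
        print("LOAD", row)


if __name__ == "__main__":
    load_into_warehouse(transform(extract_mainframe_dataset("PROD.SALES.DAILY")))
```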

What was needed was an additional step to make mainframes easier for IT managers to integrate with other cloud and distributed computing resources, and that meant a new set of software tools. The first step came from initiatives such as IBM’s z/OS Connect, which enabled distributed applications to access mainframe data. But it continued the custom-programming mindset and didn’t really give distributed applications direct access.

To fully realize the vision of mainframe data as equal cloud nodes required a major makeover, thanks to companies such as VirtualZ Computing. They latched on to the OpenAPI effort, which was previously part of the cloud and distributed world. Using this protocol, they created connectors that made it easier for vendors to access real-time data and integrate with a variety of distributed data products, such as MuleSoft, Tableau, TIBCO, Dell Boomi, Microsoft Power BI, Snowflake and Salesforce. Instead of complex, single-use data transformations, VirtualZ enables real-time read and write access to business applications. This means the mainframe can now become a full-fledged and efficient cloud computer. 
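The post doesn't publish the connectors' actual endpoints, so here is only a rough sketch of the OpenAPI-style pattern being described: a distributed application reading a mainframe-resident record over plain HTTPS instead of waiting for a batch copy. The host, path, and token below are invented for illustration and are not VirtualZ's or IBM's actual interfaces.

```python
# A rough sketch of reading mainframe-resident data through an OpenAPI-described
# REST endpoint. The host, path, and credentials are hypothetical examples,
# not any vendor's real interface.
import json
import urllib.request


def fetch_customer_record(customer_id: str, token: str) -> dict:
    """Issue a real-time read against a (hypothetical) mainframe data service."""
    url = f"https://mainframe-gateway.example.com/api/v1/customers/{customer_id}"
    request = urllib.request.Request(
        url,
        headers={
            "Authorization": f"Bearer {token}",
            "Accept": "application/json",
        },
    )
    with urllib.request.urlopen(request, timeout=10) as response:
        # The record stays on the mainframe until this call asks for it,
        # which is the "no ETL, no copy" point being made above.
        return json.loads(response.read().decode("utf-8"))


if __name__ == "__main__":
    print(fetch_customer_record("1001", token="demo-token"))
```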

VirtualZ CEO Jeanne Glass says, “Because data stays securely and safely on the mainframe, it is a single source of truth for the customer and still leverages existing mainframe security protocols.” There isn’t any need to convert COBOL code, and no need to do any cumbersome data transformations and extractions.

The net effect is an overall cost reduction since an enterprise isn’t paying for expensive high-resource cloud instances. It makes the business operation more agile, since data is still located in one place and is available at the moment it is needed for a particular application. These uses extend the effective life of a mainframe without having to go through any costly data or process conversions, and do so while reducing risk and complexity. These uses also help solve complex data access and report migration challenges efficiently and at scale, which is key for organizations transitioning to hybrid cloud architectures. And the ultimate result is that one of these hybrid architectures includes the mainframe itself.

Distinguishing between news and propaganda is getting harder to do

Social media personalization has turned the public sphere into an insane asylum, where every person can have their own reality. So says Maria Ressa recently, describing the results of a report from a group of data scientists about the US information ecosystem. The authors are from a group called The Nerve that she founded.

I wrote about Ressa when she won the Nobel Peace Prize back in 2021 for her work running the Philippine online news site Rappler. She continues to innovate and afflict the comfortable, as the saying goes. She spoke earlier this month at an event at Columbia University where she covered the report’s findings. The irony of the location wasn’t lost on me: this is the same place where students camped out, driven by various misinformation campaigns.

One of the more interesting elements of the report is a new seven-layer model (no, not that OSI one) that tracks how the online world manipulates us. It starts with social media incentives, which are designed to promote more outrage and less actual news. That in turn fuels real-world violence, which is then amplified by authoritarian-run nations that target Americans and polarize the public sphere even further. The next layer turns info ops into info warfare, feeding more outrage and conflict. The final layer is our elections, aided by the lack of real news and the absence of any general agreement on facts.

Their report is a chilling account of the state of things today, to be sure. And thanks to fewer “trust and safety” staff watching the feeds, greater use of AI in online searches by Google and Microsoft, and Facebook truncating actual news in its social feeds and as a result referring less traffic to online news sites, we have a mess on our hands. News media now shares a shrinking piece of the attention pie with independent creators. The result is that “Americans will have fewer facts to go by, less news on their social media feeds, and more outrage, fear, and hate.” This week it has reached a fever pitch, and I wish I could just turn it all off.

The report focuses on three issues that have divided us, both generationally and politically: immigration, abortion, and the Israel/Hamas war. It takes a very data-driven approach. For example, #FreePalestine hashtag views on TikTok outnumber #StandWithIsrael views by 446M to 16M, and on Facebook and Twitter the ratios are 18 and 32 times, respectively. The measurement periods for each network vary, but you get the point.

The report has several conclusions. First, personalized social media content has formed echo chambers fed by hyper-partisan sources that blur news and propaganda. Journalism and source vetting are becoming rarer, and local TV news is being remade as it competes with cable outrage channels. As more of our youth engage with social media, they become more vulnerable to purpose-fed disinformation and manipulation, and less able to distinguish between news and propaganda. And this generational divide continues to widen as the years pass.

Remember when the saying went, if you aren’t paying for the service, you are the product? That seems so naïve now. Social media is now a tool of geopolitics, and gone are the trust and safety teams that once tried to filter the most egregious posts. And as more websites deploy AI-based chatbots, you don’t even know if you are talking to humans. This just continues a trend about the worsening of internet platforms that Cory Doctorow wrote about almost two years ago (he used a more colorful term).

In her address to the Nobel committee back in 2021, Ressa said, “Without facts, you can’t have truth. Without truth, you can’t have trust. Without trust, we have no shared reality and no democracy.”

Book review: GenAI for Dummies by Pam Baker

Pam Baker has written a very useful resource for AI beginners and experts alike. Don’t let the “Dummies” title fool you into thinking otherwise. This is also a book that is hard to get your arms around – in that respect it mirrors what GenAI itself is like. Think of it as a practical tutorial on how to incorporate GenAI into your working life to make you a more productive and potent human. It is also not a book that you can read in some linear front-to-back sense: there are far too many tips, tricks, strategies and things to think about as you move through your AI journey. But it is a book that is absolutely essential, especially if you have been frustrated at learning how to better use AI.

Underlying it all is Baker’s understanding of the winning formula for using GenAI: the output from the computer sounds like a human, but to be really effective, the human must think like a machine and tell GenAI what you want through better prompt engineering. (She spends an entire chapter on that subject, with lots of practical suggestions that combine the right mix of clarity, context and creativity. And so you will find there is a lot more depth to this than you think.) “You must provide the vision, the passion, and the impetus in your prompts,” Baker writes. Part of that exploration is understanding how to best collaborate with GenAI. To that end, she recommends starting with a human team that works together as moderators in crafting prompts and refining the results from the GenAI tool. “The more information the AI has, the more tailored and sophisticated the outputs will be,” she writes.
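To illustrate the clarity, context and creativity mix Baker describes, here is a small, hypothetical prompt template assembled in Python. The wording and field labels are my own invention for illustration, not taken from the book.

```python
# A small illustration of structuring a prompt around clarity, context and
# creativity. The template and example values are invented, not from the book.

PROMPT_TEMPLATE = """\
Role: You are an editor for a weekly technology newsletter.

Task (clarity): Summarize the article below in exactly three bullet points
for readers who manage enterprise IT budgets.

Background (context): The audience already knows what a mainframe is;
do not define basic terms. Keep each bullet under 25 words.

Angle (creativity): Close with one surprising historical parallel.

Article:
{article_text}
"""


def build_prompt(article_text: str) -> str:
    """Fill the template with the source material the model should work from."""
    return PROMPT_TEMPLATE.format(article_text=article_text.strip())


if __name__ == "__main__":
    print(build_prompt("IBM mainframes now run Linux and speak TCP/IP..."))
```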

To that end, Baker used this strategy to create this very book, which was the first such effort for its publisher, Wiley. She says it took about half the time to write compared with other books that she has written on technology. This gives the book a certain verisimilitude and street cred. That doesn’t mean ripping the output and setting it in type: that would have been a disaster. Instead, she used AI to hone her research and find sources, then went to those citations to find out if they really exist, adding to her own knowledge along the way. “It really sped up the research I needed to do in the early drafts,” she told me. “I still used it to polish the text in my voice. And you still need to draft in chunks and be strategic about what you share with the models that have a public internet connection, because I didn’t want my book to be incorporated into some model.” All of this is to say that you should use AI to figure out what you do best, which will narrow down the most appropriate tools to use to eliminate the more tedious parts.

Baker makes the point that instead of wasting effort on trying to use GenAI to automate jobs and cut costs, we should use it to create rather than eliminate. It is good advice, and her book is chock full of other suggestions on how you can find the sweet spot for your own creative expressions. She has an intriguing section on how to lie to the model to help improve your own productivity, what she calls a “programming correction.” The flip side of this is also important, and she has several suggestions on how to prevent models from generating false information.

She catalogs the various GenAI vendors and their tools by how they craft text, audio and visual outputs, and then summarizes several popular uses, such as generating photorealistic artworks from text descriptions, some of which she has included in this book. She also explodes several AI myths, such as the notion that AI will take over the world or lead to massive unemployment. She has several recommendations on how to stay on top of AI developments: continuously upskill your knowledge and tools, become more agile to embrace changes in the field, have the right infrastructure in place to run the models, and keep on top of the ethical considerations for its use.

By way of context, I have known Baker for decades. We were trying to figure out when we first began working together, and both our memories failed us. Over time, one of us has worked for the other on numerous projects, websites and publications. She is an instructor for LinkedIn Learning, and has written numerous books, including another “Dummies” book on ChatGPT.

Behind the scenes at my local board of elections

If you have concerns about whether our elections will be free and fair, I suggest you take some time to visit your local elections board, see for yourself how they operate, ask your questions, and air your concerns. That is what I did, and I will tell you about my field trip in a moment. I came away thinking the folks who staff this office are the kind of public servants who deserve our respect for doing a very difficult job, and for doing it with humor, grace, and a sense of professionalism that usually doesn’t get recognized. Instead, these folks are vilified and targeted by conspiracies that have no place in our society.

First, some background. I wrote in August 2020 about the various election security technologies that were being planned for the 2020 election here, and followed up with another blog in December 2020 reporting that those elections were carried out successfully and accurately, along with another blog written last August about further election security insights gleaned from the Black Hat trade show.

A few weeks ago, I was attending a local infosec conference in town and got to hear Eric Fey, who is one of the directors of the St. Louis County election board. He spoke about ways they are securing the upcoming election. The county is the largest by population in the state, home to close to a million people.

He offered to give anyone at the conference a tour of his offices to see firsthand how they work and what they do to run a safe, secure and accurate election.

So naturally I took him up on it, and we spent an hour walking around the office while he answered my numerous questions. Now he is a very busy man, especially this time of year, but I was impressed that a) he made good on his offer, and b) he was so generous with his time with just an interested citizen who didn’t even live in his jurisdiction. (I live in nearby St. Louis City, which has its own government and elections board.)

There are about fifty people in the elections board offices, split evenly between Democrats and Republicans. Wait, what? You must declare your affiliation? Yes. That is the way the Missouri elections boards are run. Not every county is big enough to have an elections board: some of the smaller counties have a single county clerk running things. And Fey is the Democratic director. He introduced me to his Republican counterpart.

Part of the “fairness” aspect of our elections is that both parties must collaborate on how they are run, how the votes are tabulated, and how ballots are processed. And they must do all this within the various and ever-changing election laws of each state. We went into the tabulation room, which wasn’t being used. There were about a dozen computers that would be fired up a few days before the election, when workers are allowed to start tabulating the absentee and mail ballots. These computers are never connected to the internet. They run special software from Hart InterCivic, one of the election providers that the state has approved. Okay, but what about the results? Fey says, “We use brand new USBs for a single transaction. We generate a report from the tabulation software and then load that report on the USB. That USB is then taken to an internet connected computer and the results are uploaded. That particular USB is then never used again in the tabulation room.”

Speaking of which, when the room is filled with workers, the door is secured by two digital locks and must be opened in coordination. Think of how nuclear missile silos are manned: in this case though, as you can see in the above photo, a Democrat must enter their passcode on their lock, and a Republican must enter their different passcode on their lock.

The Hart PCs have a hardware MFA key, are password protected, and have separate passwords for the two parties. What happens when they need new software? The county must drive them, in one of its own vehicles, to Hart’s Austin offices to be updated, with both parties present at all times. This establishes a chain of custody and ensures they aren’t tampered with.

The elections board office is attached to a huge warehouse filled to the brim with several items: the voting machines that will be deployed to each polling place, of course. The tablets that are used by poll workers (as shown here) to scan voters’ IDs (typically driver’s licenses) and identify which ballot they need to use. These ballots are printed on demand, which is a good thing because that process eliminated a lot of the human error from the past, when voters got the wrong ballots. And loads of paper: the board is required to keep the last election’s ballots stored there. And commercial batteries and spare parts for all the hardware too, because on election day, staffers travel around to keep everything up and running. Why batteries? In case of a power failure at the polling place. Don’t laugh – it has happened.

Getting the right combination of polling places is more art than science, because the county has limited control over private buildings. One Y decided they didn’t want to be a polling place this year, and Fey’s staff found a nearby elementary school. Public buildings can’t decline their selection.

One thing Fey mentioned that I hadn’t thought about is how complex our ballots typically are. We vote for dozens of down-ballot races, propositions, and the like. In many countries, voters are just picking one or two candidates. We have a lot of democracy to deal with, and we shouldn’t take it for granted.

So how about ensuring that everyone who votes is legally entitled to vote? They have this covered, but basically it boils down to checking a new registration against a series of federal and other databases that indicate whether someone is a citizen, whether they live where they say they live, whether they are a felon, and whether they are deceased. These various checks convinced me that there aren’t groups of people who are trying to cast illegal votes, or bad actors who are harvesting dead voters. Fey and I spent some time going through potential edge cases and I was impressed that he has this covered. After all, he has been doing this for years and knows stuff. There have been instances where green card holders registered by mistake (they are allowed to vote in some Maryland and California local elections, but not here in Missouri) and then called the elections board to remove themselves from the voting rolls. They realize that a false registration can get them imprisoned or deported, so the stakes are high.

Let’s talk for a minute about accuracy. How are the votes tabulated? There are several ways. In Missouri, everyone votes using paper ballots. Processing them isn’t typically a problem, because as I said they are freshly printed out at the polling place and then immediately scanned in. This is how we can report our results within an hour of the polls closing. The ballots are collected and bagged, along with a cell phone to track their location, and then a pair of drivers (one D, one R) head back to the office. Fey said there was one case where a car was in an accident, and the central war room that was tracking them called before they had a chance to dial 911. They take their chain of custody seriously on election night.

If you opt for mail-in ballots, though, the ballot quality becomes an issue. Out of the hundreds of thousands of ballots the county office received in 2020, about four thousand or so looked like someone had tried to light them on fire. Each of these crispy ballots had to be copied onto new paper forms so they could be scanned. Why so many? Well, it wasn’t some bizarre protest — it turns out that many folks were microwaving their ballots, because of Covid and sanitation worries. It was just another day of challenges for the elections board, but they took it in stride.

The paper ballots are then put through a series of audits. First, the actual number of ballots is counted by machine to make sure the totals match up. They had one ballot that was marked with two votes, with one crossed out. So the team located the ballot, saw that the voter had changed their mind, and corrected the totals. That is the level of detail that the elections board brings to the final count. They also pick random groups of ballots to ensure that the votes match what was recorded.

As you can see, they do their job, and I think they do it very well. If you are thinking about your own field trip, ballotpedia.org is a great resource if you want more details about how your state runs its elections, where and how to vote, and contacts at your local election agency.

CSOonline: Top IDS/IPS tools

An intrusion detection or prevention system can mean the difference between a safe network and a nasty breach. We’ve rounded up some of the best and most popular IDS/IPS products on the market.

Detecting and preventing network intrusions used to be the bread and butter of IT security. But over the past few years, analysts and defenders have seen a slow but steady transition away from these standalone products. They have become one component of a broader spectrum of network defensive tools, such as security information and event management (SIEM) systems, security orchestration and response (SOAR) platforms, and endpoint and network management and detection systems.

For CSO, I examine the top six commercial tools and four open source ones, explain the different approaches and form factors used, and compare how intrusion prevention fits into the overall security marketplace.

A new way to create podcasts using AI

I have been creating podcasts on and off — mostly off — since 2007, when Paul Gillin and I came up with the idea to talk to each other about war stories from the tech PR world. (You can listen to the very first pod here.) That series would eventually evolve into several different pods that Paul and I would do over the years, with the most recent episode here. In between these shows, I would freelance pods to various clients and publications such as eWeek and various IDG ones.

I tell you this because today, in the span of a few minutes, I managed to create some very credible podcasts out of what was previously just my written content, using a new Google tool called notebooklm.google.com. You upload your documents (PDFs or text files) and it converts them into a two-host conversation that draws on some of your material, along with using AI to bring in other information. The two “hosts” sound great: one is a male voice and the other a female voice. Call them Dick and Jane. The AI adds in almost the right amount of temporizing, with “ums” and “likes” and back-and-forth byplay in the conversation. You can download the audio files here and hear them for yourself:

  • I wrote a handbook for CSOonline recently about AI security posture management. Here is the pod:
  • I also wrote an article for Internet Protocol Journal about the history of the Interop Shownet. Here is that pod:

I did almost no additional work to create these pods, other than searching my own hard drive to find something that I wrote. Both of these samples are about ten minutes long. And while I wrote every word in both articles, the pods use examples that I never wrote (that were actually quite good) and bring in other information. Using its ML routines to keep things more conversational works reasonably well, and you almost believe that Dick and Jane are two live humans talking to each other about something that they “just read.” I could do with a few fewer inserted “likes,” which seem to be the basic conversational building block of a certain generation. One thing that Google hasn’t coded into its system is having the two hosts talk over each other, which I find annoying on other pods that have multiple human hosts.

Google’s tool is still very much in the experimental stage, but it is free to try out. In addition to creating podcasts, you can also query the content you upload just like any AI system, and it will also provide a summary and FAQs and other supporting things around your content. I would suggest that you don’t upload any private content however.

What does this mean for podcasters? Well, uh, things are going to get very interesting. While the Dick and Jane voices aren’t yet configurable, they are pleasant to listen to and seem 85% human. It also portends that my podcast business is probably dead in the water, not that I ever relied on it to produce any significant revenue. Given that I don’t cultivate any political outrage, or any outrage (other than from non-working tech or over-promised products), there was zero chance that my podcasting career would ever take off.

If you do produce some pods that you would like me to listen to and compare with the original source materials, do drop a note in the comments.

Ways to harden your VPN

Susan Bradley writes today in CSOonline about ways to improve your password hygiene, especially if you are using a VPN to connect to your corporate network. I am horrified to report that I am guilty of doing Bad Things according to Bradley, and what is worse, that I should know better. Let’s review her suggestions:

First, one of the common attacks takes advantage of password fatigue, whereby someone can gain access to your accounts by guessing a variation of a password of yours that was published on the dark web. She writes: “Too many people merely add a letter to a password rather than choosing a better passphrase.” That hand going up in the front of the room is my own. There is no excuse for it — I have a password manager that can make my passwords as complex as need be. Sometimes I add a character in the middle of my previous password. Far better to use multi-factor authentication, she says. I would agree with her, but many of my hundreds of logins don’t support MFA. That is another travesty, to be sure. But color me lazy.
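For what it’s worth, the passphrase advice is easy to follow with a password manager, or even a few lines of code. Here is an illustrative sketch using Python’s standard secrets module; the word list is deliberately tiny, and a real tool would draw from thousands of words (the EFF diceware list, for example).

```python
# An illustrative passphrase generator along the lines Bradley suggests:
# several random words beat an old password with a letter tacked on.
# The word list here is deliberately tiny; a real tool would use thousands
# of words (for example, the EFF diceware list).
import secrets

WORDS = [
    "anchor", "bramble", "copper", "drizzle", "ember", "fjord",
    "glacier", "harvest", "ivory", "juniper", "kestrel", "lantern",
]


def make_passphrase(word_count: int = 5, separator: str = "-") -> str:
    """Pick words with a cryptographically secure RNG and join them."""
    return separator.join(secrets.choice(WORDS) for _ in range(word_count))


if __name__ == "__main__":
    print(make_passphrase())  # e.g. "ember-kestrel-anchor-fjord-copper"
```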

Another no-no is relying on checks for what is called “impossible travel” — whereby your login happens in one place, and your credentials are used in another place halfway across the planet shortly thereafter. VPNs check for this using location tracking. Wait, I thought this was good practice? Not any more: Bradley says it offers a false sense of security and we shouldn’t rely on geolocation blocking, because attackers have figured out ways around the blocks or how to obscure their locations.
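For readers curious about what an “impossible travel” check actually computes, here is a minimal sketch under common assumptions: geolocate two logins, estimate the great-circle distance between them, and flag the pair if the implied speed is faster than an airliner. The coordinates and threshold are illustrative only, and as Bradley notes, attackers can sidestep this by masking their location.

```python
# A minimal sketch of an "impossible travel" check: if two logins imply a
# travel speed faster than a commercial airliner, flag them. The coordinates
# and the speed threshold are illustrative values.
import math
from datetime import datetime, timezone

EARTH_RADIUS_KM = 6371.0
MAX_PLAUSIBLE_KMH = 900.0  # roughly airliner cruising speed


def haversine_km(lat1, lon1, lat2, lon2) -> float:
    """Great-circle distance between two points, in kilometers."""
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    a = math.sin(dphi / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dlam / 2) ** 2
    return 2 * EARTH_RADIUS_KM * math.asin(math.sqrt(a))


def is_impossible_travel(login_a, login_b) -> bool:
    """Each login is a (timestamp, latitude, longitude) tuple."""
    (t1, lat1, lon1), (t2, lat2, lon2) = sorted([login_a, login_b])
    hours = (t2 - t1).total_seconds() / 3600.0
    if hours == 0:
        return True
    distance = haversine_km(lat1, lon1, lat2, lon2)
    return distance / hours > MAX_PLAUSIBLE_KMH


if __name__ == "__main__":
    st_louis = (datetime(2024, 10, 1, 9, 0, tzinfo=timezone.utc), 38.63, -90.20)
    singapore = (datetime(2024, 10, 1, 11, 0, tzinfo=timezone.utc), 1.35, 103.82)
    print(is_impossible_travel(st_louis, singapore))  # True: ~15,000 km in 2 hours
```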

Finally, she offers this wisdom: “It doesn’t hurt to reevaluate your current VPN platforms and consider alternatives such as managed-cloud VPN solutions, bearing in mind that MFA should be mandatory on all accounts.”

Bradley also runs AskWoody, another excellent resource.

Book review: Mapping St. Louis

Andrew Hahn’s delightful compendium of 40 rare maps of the St. Louis area is informative and an amazing record of the growth — and decline — of the region. He has put together maps from 1767 to the present, including some “fantasy maps” of how contemporary geographers envision the future infrastructure of the city. The maps show how the city developed around the confluence of the Missouri and Mississippi Rivers, and how events such as the Cyclone of 1896 and the fire of 1846 damaged various neighborhoods.

There are many different styles of maps featured, including maps for exploration and navigation, pocket and atlas maps, development and planning maps and pictorial maps.

Two places in the history of the city are chronicled with maps:

There are maps which show the massive population movements of the city — reaching a peak population of some 850,000 in 1950, only to decline to about 280,000 residents today.

Hahn is a seventh generation St. Louis native, and since 2003 he has worked as director of the Campbell House Museum, an 1851 townhouse in downtown St. Louis.

Portable air pumps for bike and car: a work in progress

As an avid cyclist, I have collected a variety of tools to keep my tires inflated, both on the road and at home. These include:

  • A floor pump that has a pressure gauge and fits both Presta and Schrader valve types. We have both here in my family. This is used most frequently, because high-pressure tires tend to lose air over time.
  • A portable pump that has a Presta connection that I carry with me when I am riding. This is useful if I get a flat and have to inflate a new tire just enough to fit it on my rims.
  • A collection of CO2 high-pressure cartridges that can inflate my Presta tires up to full pressure. These are good for a single blast of air.

My social feeds have been filled with various ads for portable electric air pumps. In the distant past, these things were portable in the same sense that the first portable computers were: they were bulky and you wouldn’t want to carry them very far. But the latest generation of pumps weigh about a pound and could easily be carried with you on a ride, or fit in your car’s glove compartment. They range in price from $40 to $120, mostly made in China, and resemble an old-style walkie-talkie in dimensions. I bought one of the cheaper ones on AMZ.

The features that I wanted included:

  • Rechargeable battery via USB. The battery should last through a few inflations of your car tires, because you are pushing a lot more air. The unit that I got needed infrequent recharging. The pump’s screen should give you a rough idea of how much charge is available. Having a USB cable also makes recharging from your car simple, if you have the right cables.
  • A long enough hose that fits between the tire and the pump, so you can maneuver the pump around the target tire more easily.
  • Fits both kinds of tires. The way my pump does this is with a small and eminently losable adapter that screws on to the Schrader valve if you want to inflate a Presta tire. This adapter is very hard to fit on the valve stem, and has to be tightened enough to ensure you aren’t deflating your tire before you even start the inflation process. It took me a few tries to figure out the process of attaching and detaching the pump from both bike and car, and it would be easier if the vendor supplied two separate hoses rather than the add-on adapter, but I didn’t find a unit that came that way.
  • Easy operation and a usable screen. My pump shows the existing pressure when attached, and you can set it to stop automatically at a given pressure for both bike and car scenarios. That is helpful, especially for bike tires, which can be easily overinflated. It shows battery status too.

One question was whether the new electric unit could replace both my hand pump and CO2 cartridges on my rides. Even the smaller units still weigh more than that combination, but it is possible, although you would probably have to carry it on your person or in a big enough bag on your bike. (My hand pump has a bracket that fits on my bike’s frame.) However, this means using the new pump twice when fixing a flat during a ride: once to get some air in the tire before you put it on the rim, then re-attaching it for the full inflation. It would be nice if the new pump could snap on and off the valve like my other pumps, but I didn’t see any units that offered that kind of mechanism.

I am not recommending my specific pump, and am calling the whole genre a work in progress. Some of them get very hot as their tiny pump motors work overtime to push the air through, especially for car tires. Some weigh a lot more, making them difficult to carry in the back pockets of your jersey. The attach/detach process can be tricky: one time I unscrewed my valve stem completely when trying to remove the short air hose. And there doesn’t seem to be any relationship between price, quality, and user satisfaction from what I could tell.