Sunday, July 23, 2017

Two Missing P's

Two Ps From a pod

Arrgghh!!! I hate it when I outsmart myself!! I just finished a three-hour ordeal debugging why my app couldn't connect to my server.

I started by running the raw REST calls through Paw to make sure I was actually calling the REST endpoint correctly and I could create records on the backend.

Since this is an authenticated service, I first had to sign in, so I set up a separate configuration just to get the user's token.

Next I set up the actual call to post the record, putting the user's email and token in the header and the data for the record I wanted to create in the body (as JSON).

I ran it and everything worked fine. So it should have been as simple as do the same thing in the app. Riiiiiigggghhht.

In the app I'm using Alamofire to make the REST calls, and after successfully logging in I went on to (attempt to) create the new record.

This is what I got in the Rails server log:

User Load (2.1ms)  SELECT  "users".* FROM "users" WHERE "users"."email" = $1 ORDER BY "users"."id" ASC LIMIT $2  [["email", "test@home.com"], ["LIMIT", 1]]
Completed 400 Bad Request in 19ms (ActiveRecord: 2.1ms)

ActionController::ParameterMissing (param is missing or the value is empty: portfolio):

I know the load of the user is just Devise's before_action kicking in. But why don't I even see the parameters being passed in?

Ok, I thought, I have seen the param-missing error before, and it means that for some reason my payload isn't getting to the server.
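For context, that error comes from Rails' strong parameters: the white-list method calls ActionController::Parameters#require, and require raises when the top-level key is absent. Here's a plain-Ruby sketch of that behavior (hypothetical code, no Rails needed):

```ruby
# Hypothetical sketch of ActionController::Parameters#require semantics --
# it raises when the requested top-level key is missing or empty, which is
# exactly the ParameterMissing error in the log above.
class ParameterMissing < StandardError; end

def require_param(params, key)
  value = params[key]
  if value.nil? || (value.respond_to?(:empty?) && value.empty?)
    raise ParameterMissing, "param is missing or the value is empty: #{key}"
  end
  value
end

# Roughly the params Rails actually saw: routing info and headers, no body.
params = { "controller" => "api/v1/portfolios", "user_email" => "test@home.com" }

begin
  require_param(params, "portfolio")
rescue ParameterMissing => e
  puts e.message # => param is missing or the value is empty: portfolio
end
```

So the 400 means Rails never saw a top-level "portfolio" key at all, not that the record's data was invalid.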

So I put in a few debug statements on the Rails side, specifically:

logger.debug "raw post: #{request.raw_post}"
logger.debug "params: #{params.as_json}"
logger.debug "local_params: #{portfolio_params.as_json}"

Note: portfolio_params is my "white-list" (strong parameters) method for the controller. This resulted in the following output:

User Load (2.5ms)  SELECT  "users".* FROM "users" WHERE "users"."email" = $1 ORDER BY "users"."id" ASC LIMIT $2  [["email", "test@home.com"], ["LIMIT", 1]]
raw post: {"portfolio":{"store_on_server":false, ... }}
params: {"controller"=>"api/v1/portfolios", "action"=>"create", "format"=>"json", "user_email"=>"test@home.com", "user_token"=>"_Csa8xXWzV-sfcazxdwJ"}
Completed 400 Bad Request in 11ms (ActiveRecord: 2.5ms)

Huh? The raw post line confirms my portfolio body is getting to the server, but when I look at the params line, it's nowhere to be found.

Hmm, so I Googled how to set the body in Swift 3 using Alamofire and found this post:

How to set body on post with AlamoFire

I looked at the example's syntax:

Alamofire.request("http://myserver.com", method: .post, parameters: parameters, encoding: JSONEncoding.default)
    .responseJSON { response in
        print(response)
    }

Here is what I had:

let request = Alamofire.request(urlString, method: .post, parameters: json, encoding: JSONEncoding.default, headers: headerCacheManager.getLatestHeaderFields())

Uhh, mine is the same EXCEPT for the call to add the header fields, but what could be wrong there? Surely the idiot programmer (me) who wrote the getLatestHeaderFields() method didn't mess that up, right? I mean, I have been successfully using it for months. So no need to check it, right!?

Next I printed out the actual JSON on the app side, prior to passing it to Alamofire:

print("JSON is: \(json.debugDescription)")

It looked right to me. Now what?

So I went old school. First I got the curl representation of what was working from Paw:

curl -X "POST" "http://localhost:3000/api/v1/portfolios" \
     -H "X-User-Token: _Csa8xXWzV-sfcazxdwJ" \
     -H "X-User-Email: test@home.com" \
     -H "Cookie: _pr_session=Uk1vVFUxSmRIS3J2YVN6UUlTRUczbzg4KytIbC9rajdXKzZTUDFiZ0laTnRzaTdCdERDYUFscmE0a05Zd3Y0eGdBV0lYKzNiWFIwOVpjM0Z0c3dpeWVwaWpsZWM2N1Nkb2xWZWRsM2VFazVybDhBd1Jaa2d1OHhQS2ZnQm1pbmo2bzZGTXBSSEh5Y3FtWkZZRnNnUzZFYmVTamVKTTVUUUNHaDFHOTdwb0h3VVNWWVBiS3ptNUlQUHI2Mk9iZXE1aTdnQi9aWk5HbnpzV3dlRXAyNzdFczVrckt5NEwzMk02QmRYSWxyS2IwbndzRFNXNDVsSzdYQlNVd0o4TUd3aC0tQVN0YkV2MWtXTGJyYVl3S0IxaHNLUT09--32ceac68ff051bf41d3437ecdc6b04730c00cc9d" \
     -H "Content-Type: application/json" \
     -H "Accept: application/json" \
     -d $'{
  "name": "Portfolio",
  "uuid": "uuid1"
}'

Pasting this into a terminal worked fine as well. But how could I see the request as it left the client? That's what I needed.

Eventually I figured out I could print a debug description of the Alamofire request object, which would give me the curl command the app was using.

print("Request: \(request.debugDescription)")

The output of that was:

Request: $ curl -v \
    -X POST \
    -H "Content-Type: aaplication/json" \
    -H "X-User-Token: _Csa8xXWzV-sfcazxdwJ" \
    -H "Accept: aaplication/json" \
    -H "User-Agent: PortfolioRebalancer/1.0 (com.talonstrikesoftware.PortfolioRebalancer; build:1; iOS 10.3.1) Alamofire/4.5.0" \
    -H "Accept-Language: en;q=1.0" \
    -H "X-User-Email: test@home.com" \
    -H "Accept-Encoding: gzip;q=1.0, compress;q=0.5" \
    -d "{\"portfolio\":{\"store_on_server\":false, ...}" \
    "http://localhost:3000/api/v1/portfolios.json"

Do you see the problem? I didn't, so I copied the string into a terminal window, executed it, and it failed.

Ok, on to comparing each parameter. Do you see the problem now? Yeah, whether it was because it was late, or my monitor font was too small, or the fact that I wrote this code months ago, it took me forever to see it.

It turns out about a week ago I had refactored the HeaderCacheManager class and I made a typo. What should have been:

headers["Accept"] = "application/json"
headers["Content-Type"] = "application/json"

was actually:

headers["Accept"] = "aaplication/json"
headers["Content-Type"] = "aaplication/json"

I had misspelled the word "application". Doh!!!
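In hindsight the failure mode makes sense: Rails decides whether to parse the request body as JSON based on the Content-Type header, so the typo meant the raw post arrived intact but was never merged into params. A simplified Ruby sketch of that decision (not Rails' actual middleware):

```ruby
require "json"

# Simplified sketch of the idea (not Rails' real parsing middleware):
# the body only gets parsed into params when the Content-Type matches.
def merge_body_params(params, raw_post, content_type)
  return params unless content_type == "application/json"
  params.merge(JSON.parse(raw_post))
end

route_params = { "user_email" => "test@home.com" }
body = '{"portfolio":{"name":"Portfolio","uuid":"uuid1"}}'

# With the typo'd header the body is silently ignored:
puts merge_body_params(route_params, body, "aaplication/json").key?("portfolio") # false
# With the correct header the portfolio payload lands in params:
puts merge_body_params(route_params, body, "application/json").key?("portfolio") # true
```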

After fixing those two p's, the server responded like I would have expected:

Started POST "/api/v1/portfolios.json" for 172.17.0.1 at 2017-07-20 21:43:20 +0000
Processing by Api::V1::PortfoliosController#create as JSON
  Parameters: {"portfolio"=>{"store_on_server"=>false, ...}}
  User Load (2.9ms)  SELECT  "users".* FROM "users" WHERE "users"."email" = $1 ORDER BY "users"."id" ASC LIMIT $2  [["email", "test@home.com"], ["LIMIT", 1]]
  Portfolio Load (2.2ms)  SELECT  "portfolios".* FROM "portfolios" WHERE "portfolios"."uuid" = $1 ORDER BY "portfolios"."id" ASC LIMIT $2  [["uuid", "43E48CC3-286E-4CB2-914E-EBEA69707480"], ["LIMIT", 1]]
   (0.3ms)  BEGIN
  User Load (1.9ms)  SELECT  "users".* FROM "users" WHERE "users"."id" = $1 LIMIT $2  [["id", 1], ["LIMIT", 1]]
  SQL (1.8ms)  INSERT INTO "portfolios" [blah, blah ,blah]
   (0.5ms)  COMMIT
Completed 201 Created in 36ms (Views: 0.6ms | ActiveRecord: 9.6ms)

Done!!! In the end that request.debugDescription saved my bacon, and I guess I just need to learn to spell better (or type better, I'm not sure).

Till next time.

Monday, April 10, 2017

Plan A


The past few weeks I have been working on my new app. It is a stock tracking app and I ran into the issue of how to get live stock prices for free.

To be honest, I would pay a small monthly fee for this capability, but the sites that alluded to having a paid service didn't publish their prices. They just said "contact us".

I figured that meant I couldn't afford them.

So that sent me down the tried and heavily worn paths, such as Yahoo Finance and screen scraping.

For my application, given an equity's symbol, I needed three bits of information:

  • Title
  • Description
  • Latest price

In my use case, the user enters the symbol, and I then retrieve the name and description to allow them to confirm this is what they want to use.

Later in the workflow, at specific touch points in the app, I want to retrieve the latest prices for all the equities the user is tracking.

In the end I decided I would choose a hybrid approach.

Most of the solutions I found pointed at using Yahoo Finance for live stock data. The problem was that I could not find a way to get the equity's description that way.

The only way I could figure to get the description was to look up the symbol on Google and then screen scrape the result.

So my hybrid solution is to screen scrape when the user is searching for an equity to track, and then, once they are tracking it, use Yahoo Finance to pull the live data.

One caveat before I go into the code: I realize Yahoo has been bought out and there is no guarantee my solution will keep working even in the near future. That's one of the reasons I chose to have two routes to the data. But for now this works pretty well.

For this post I wanted to dive into my Yahoo Finance integration. In a future post I'll talk about the screen scraping approach.

The only prerequisite for this solution is Alamofire.

To make this clean and isolated, I created a StockService class like so:

import Foundation
import Alamofire

/**
 A struct to hold a quote for an equity
 */
struct EquityQuote {
    var symbol:String = ""
    var price: Double = 0.0
}

/**
 The Stock Service provides functions to lookup stocks and retrieve their latest prices
 
 - notes:
   see https://developer.yahoo.com/yql/
 */
class StockService {

I created a struct to capture the loaded data, and as the comment mentions, I lean on Yahoo's YQL service.

The idea is that consuming code will call the functions on this service object whenever quotes need to be retrieved. The details of how the service is found (FYI, it's not a singleton) are not germane to this post. Maybe I should add that to my list of topics to blog about.

At any rate, just assume consumers can get to the service and call the getLatestQuotes method:

func getLatestQuotes(symbols:[String], completionHandler:@escaping (([EquityQuote]) -> Void)) {

The function takes an array of symbols (as strings) and a completion handler that will be handed the list of EquityQuote objects.

There are a lot of old posts out there about how to do this, some use YQL and others point to yahoo.finance directly. After a bit of experimentation I came to the conclusion that the YQL solution was more correct for 2017.

Another thing that annoyed me was that the URL has to be URL-encoded (i.e., spaces turned into their percent-encoded equivalents, etc.). A lot of the examples you will find hard-code the conversion in their strings.

To me this made the queries hard to read and debug. So I chose to write the query the way I would in the YQL console and URL-encode it right before making the call.

So I defined constants for the things in the query that don't change; more specifically, the host, the query prefix, and the query suffix:

let QUOTE_QUERY_HOST = "http://query.yahooapis.com/v1/public/yql"
let QUOTE_QUERY_PREFIX = "?q=select * from yahoo.finance.quotes where symbol in ( "
let QUOTE_QUERY_SUFFIX = " )&format=json&env=store://datatables.org/alltableswithkeys&callback="

Next I convert the symbols array into a single string and then cobble together the entire URL.

var symbolsString = symbols.reduce("") {text, value in "\(text)\"\(value)\", "}
symbolsString = symbolsString.truncate(by:2)

let finalQuery = "\(QUOTE_QUERY_PREFIX)\(symbolsString)\(QUOTE_QUERY_SUFFIX)"
let encodedQuery = finalQuery.urlEncode()
let urlToCall = "\(QUOTE_QUERY_HOST)\(encodedQuery)"

Note the call to finalQuery.urlEncode(). This is a function I put in a String extension to encode all the characters that need to be encoded to go across the wire. I won't include that here, but if anyone is interested let me know.
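For anyone who wants to play with the assembly outside of Swift, here's the same idea sketched in Ruby (the constants mirror the ones above; the encoding here is a deliberately minimal stand-in for my urlEncode extension, which handles more characters):

```ruby
QUOTE_QUERY_HOST   = "http://query.yahooapis.com/v1/public/yql"
QUOTE_QUERY_PREFIX = "?q=select * from yahoo.finance.quotes where symbol in ( "
QUOTE_QUERY_SUFFIX = " )&format=json&env=store://datatables.org/alltableswithkeys&callback="

def quote_url(symbols)
  # join(", ") sidesteps the trailing separator that the Swift version
  # trims with reduce + truncate(by: 2).
  symbols_string = symbols.map { |s| %("#{s}") }.join(", ")
  final_query = "#{QUOTE_QUERY_PREFIX}#{symbols_string}#{QUOTE_QUERY_SUFFIX}"
  # Minimal encoding: spaces and quotes are the characters that would
  # otherwise break the request; a real implementation should escape
  # the full reserved set.
  encoded = final_query.gsub(" ", "%20").gsub('"', "%22")
  "#{QUOTE_QUERY_HOST}#{encoded}"
end

puts quote_url(["AAPL", "GOOG"])
```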

Now that I have a safe encoded url, it is time to make the call via Alamofire:

let request = Alamofire.request(urlToCall, method: .get, parameters: nil, encoding: JSONEncoding.default, headers:nil)
    request.responseJSON() { response in
        let result = response.result
        switch result {

As you can see, it is a GET call and I expect JSON back.

If I get a success, then it's time to parse. This part of the code is UGLY!! and I need to go back and use something like SwiftyJSON to clean up all the checks. But at the time I was doing a lot of experimenting to get this to work, and it was easier to just peel one layer off at a time.

case .success:
    if let value = result.value as? [String: Any] {
        if let query = value["query"] as? [String: Any] {
            if let results = query["results"] as? [String: Any] {
                if let quotes = results["quote"] as? [[String: Any]] {
                    var equityQuotes = [EquityQuote]()
                    for quote in quotes {
                        let symbol = quote["symbol"]
                        let lastTradePrice = quote["LastTradePriceOnly"]
                        if let symbol = symbol as? String, let priceText = lastTradePrice as? String, let price = Double(priceText) {
                            let equityQuote = EquityQuote(symbol: symbol, price: price)
                            equityQuotes.append(equityQuote)
                        }
                    }
                    completionHandler(equityQuotes)
                    return
                }
            }
        }
    }
    completionHandler([EquityQuote]())
case let .failure(error):
    print("\(error)")
}

What is going on here is that I'm basically diving into the returned JSON until I get to the "quote" array. I then pull out the pieces I need to build an EquityQuote object, and if I have them all, I construct the object and throw it into the array I will return. Finally, after processing all of that, I call the completion handler to notify the calling code that we are done.
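For comparison, the same unwrapping collapses to a single dig in Ruby, which is roughly the convenience SwiftyJSON would buy me on the Swift side (the payload below is hand-built to mirror the YQL response shape, not live data):

```ruby
require "json"

# Hand-built sample payload mirroring the YQL response shape (not live data).
raw = <<~PAYLOAD
  {"query": {"results": {"quote": [
    {"symbol": "AAPL", "LastTradePriceOnly": "150.00"},
    {"symbol": "BAD",  "LastTradePriceOnly": "n/a"}
  ]}}}
PAYLOAD

# dig replaces the nested if-let pyramid; || [] gives the empty-result
# fallback the Swift code handles with its final completionHandler call.
quotes = JSON.parse(raw).dig("query", "results", "quote") || []

equity_quotes = quotes.map { |q|
  price = Float(q["LastTradePriceOnly"], exception: false)
  { symbol: q["symbol"], price: price } if q["symbol"] && price
}.compact

puts equity_quotes.inspect # entries with unparseable prices are dropped
```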

One thing I still have to do is handle error conditions better. I may change the method signature to take a failure handler as well.

Note that if I get nothing from the call, I still return an empty array to the caller. I'm not sure in what cases this would happen, but it seemed like the right thing to do in order to keep the code flow going.

I could have just returned an array of doubles, but I figured by returning these EquityQuote objects the calling code could inspect the symbol of each one and match it up with the symbols it cares about. Again, it seemed safer.

That's it, so far it is working pretty well. My next post will talk about the screen scraping approach I took. Till next time.

Sunday, March 5, 2017

Back in the Game


It's been a while since I last made a post, so it is about time.

I wrote this back in January, I just never posted it. First, my 2016 MacBook Pro (MBP) saga update.

I was having so much trouble converting from using my MBP in a mobile configuration back to what I call my "desktop" configuration (which is clamshell mode with two external 4K monitors attached) that I came really close to taking it back.

It would rarely make the switch from the integrated GPU to the discrete GPU without either crashing or just losing its mind. I could never count on the transition to work. I could usually count on it to NOT work.

But with the update to macOS 10.12.3 (which hinted it fixed a GPU switching problem) it has become much more stable. I still don't go from "mobile" (clamshell closed) mode to "desktop" mode without first opening the clamshell prior to connecting the monitors. But it is much better now.

It's not perfect, but now I have a 50/50 chance of waking my computer up by just hitting a few keys on the keyboard.

The battery situation still isn't great but seems to have gotten a little better with the macOS update and the latest version of Chrome.

So in the end it took over two months but I am at least satisfied with my purchase. Here is a picture of my setup:

On the development front, I have started two new apps. One is a financial app for iOS and the other is my first game using Unity. Both are a long way away but I am excited about the endeavors.

If I had to guess, I would expect the finance app to take about 6 months and the game may never be released (if I am truly honest with my time commitments).

Now to the more interesting stuff. My side web app project.

This project is a Ruby on Rails backend supporting (right now) a mobile app via a REST API. In order for us to release the product, though, we also will need a web app.

I built a basic admin portal for the database and, several months ago, built the first part of the web client using React and the 'react-rails' gem.

As time went by, our design team turned to focus on the mobile side and as (often happens in startups) the web design languished as we discovered new workflows and data structures needed for the app.

That gave me time to read and study how best to build the web front-end based on React for a RoR backend.

After much studying I decided I should use webpack and build the React client with a true JavaScript toolchain.

I first started by rolling my own before I found the 'react-on-rails' gem, which has an opinionated way of integrating a JavaScript toolchain into a RoR application.

It looks good, but I find the examples a little hard to grok. Whether that is due to my lack of knowledge of RoR or Node, I'm not sure. I did read yesterday that RoR 5.1 will have webpack support baked in, so I am looking forward to that.

I think this is where I will stop today. Till next time.

Friday, November 11, 2016

Why I'm Buying a New MacBook Pro


I'm going to take a diversion from my regular development subjects and talk about my decision of whether to purchase a new MacBook Pro. A lot has been written about a machine that has barely even hit the market, so I thought I would give one developer's perspective.

Background

Let me give a bit of a background:

I currently use a late-2011 13-inch 2.8 GHz i7 MacBook Pro (non-Retina). It was originally configured with a 750GB hard drive and 4GB of RAM.

This has been a phenomenal machine. I primarily use it for development of mobile and web applications. I also run several graphics programs and tools to support my development efforts.

You can read my adventures in those areas in my other posts.

Since I bought this machine, I upgraded it to 16GB of RAM and this past summer I could no longer deal with the long compile times, so I installed a 450GB SSD.

I primarily use it connected to an external 4K display and go mobile about 3 to 4 times a week, for an average of about 6 hours each week, except on holidays when I visit family, when I am 100% mobile.

I estimate, on average, I have used it about 1000 hours per year, or about 5000 hours in total. I figure just the hardware costs have been around $1850 plus tax. This gives me a total cost of ownership of 37 cents per hour.

For the last 18 months I have limped along, resisting the need to upgrade to a new Mac because I was waiting on the Skylake-based ones. They just never seemed to materialize.

Had I not installed that SSD, I would have had to upgrade regardless of an impending new model release. Compile times had gotten so sluggish it was almost impossible to develop.

There are also several things I wanted in my development environment. One is the ability to run two external monitors; my current machine cannot do that. Additionally, when I am mobile, there just isn't enough screen real estate to use Xcode efficiently.

I mean, typing code is OK, but long lines have to be wrapped a good bit, and using Interface Builder is practically impossible on the 13-inch screen.

So I feel like I am due. I have been disappointed so many times, hoping for an announcement.

The Announcement

So when the October announcement came, all I could say was FINALLY!!!

Going into the announcement there were some things that concerned me. I knew (or figured at least) the new Macs would not be upgradable. So I knew I was going to have to pay for the configuration I wanted, up front.

I was also thinking that, to help alleviate the screen real estate problem while mobile, I would need to move up to a Retina 15-inch model.

As an added benefit, the 15-inch model has a stronger CPU and graphics capabilities.

My plan was to purchase the new Mac and an ASUS external monitor. I figured based on the current model prices, I would be able to get a decent machine for less than $3K.

The Contemplation

Unfortunately, as we now know, Apple raised the prices. Dramatically, if you ask me. After speccing out the new machine with the required adapters, I am topping out over $3500.

It sickens me that I am paying such a high price (the cost of a serviceable used car). So I am certainly having buyer's remorse.

I have seen posts from others claiming they are done with Apple because of the price and the removal of all ports except USB-C. These posters state they are going back to Windows.

In fact, this new machine will actually cost more than the just-announced base Microsoft Surface Studio, which looks pretty nice and innovative if you ask me.

But going to Windows just isn't an option for me to continue to support my current development efforts.

I have to run OS X to build iOS-based mobile apps. So no matter how much I might want to purchase a more powerful laptop at half the price, I can't, because it can't run OS X.

If one could, I would seriously consider it.

Let me be clear, I am not a Windows hater, and I truly believe that without the advances and marketing Microsoft did with its Windows OS, we would not be where we are today in technology. I think we owe a lot to them. So I don't understand when all the Apple "Fan-Boys" say they hate Windows when nine times out of ten they can't say specifically why. Haters gotta hate, I guess, but I'm not one of those.

Anyway, friends have reminded me that back in the 80's and 90's we used to pay that kind of money for even the most basic of machines. For example, my first PC cost $3200 in 1989 and I had to get a loan for it. But that was then, this is now.

I still think this new machine just shouldn't be this expensive for what you get. But it is.

As far as it only having USB-C ports, that doesn't bother me. Since I basically "dock" the machine when I am home, I am used to having cables and adapters around. So that argument is a non-starter with me.

So that brings me to the moment of making a decision.

The Analysis

Here is how I did the math.

If I assume going from the 13-inch I have to a 15-inch of the same year would have a reasonable added cost of about 20%, then I am at $2220.

If I then assume a 3% (per year) inflation rate (which we all know has actually been closer to 0% for the last 5 years), I get a total cost of $2573.59.

Hmm, very interesting, this is really close to what I had budgeted for the new machine plus a $400 external monitor.

So now the delta from what I think I should pay to what I am paying is about $1000. Whew, that's a stretch.

To be fair, I upgraded to the fastest CPU and graphics card, and increased the SSD to 1TB, so I could see adding another $400 for that.

That brings my delta to $600. Getting there but not quite. If the delta at this point had been less than $200 this would be a slam dunk. But it isn't so I had to fret over it. A LOT!!

I also, briefly, considered the "hackintosh" route, but I want to code, not configure a machine for the rest of my life. So I dismissed that option.

Another option, is to trudge along with my existing machine till it dies or no longer supports the latest version of OS X. The problem with this is, these things always seem to die when you least can afford the lost time. Not to mention it takes a good bit of time to configure a new machine (especially for a developer) to be productive again. That prospect wasn't very appealing to me.

The final option I considered was to purchase the previous model. I actually spec'd that out, and my cost was already at $2500. That was $400 less than what I consider a reasonable cost for this new model, and $1000 less than my actual drive-out cost. But getting a machine that is 18 months old wasn't very appealing either. It's certainly not "sexy".

My next question was: how long would I need to keep the new machine to get down to the 37-cents-per-hour cost of ownership? Running the quick math, I get 9459 hours, or about 9.5 years.

I'll never make that. For comparison, I would need to keep the previous model for 6.7 years.
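For anyone checking my numbers, the payback arithmetic above, spelled out (assuming my 1000 hours per year of use):

```ruby
# Cost-of-ownership math from the analysis above.
baseline_rate = 1850.0 / 5000        # old machine: dollars per hour
new_hours     = 3500 / baseline_rate # hours to hit the same rate on the new Mac
prev_hours    = 2500 / baseline_rate # same math for the previous model

puts baseline_rate.round(2)       # 0.37
puts (new_hours / 1000).round(2)  # ~9.46 years at 1000 hours/year
puts (prev_hours / 1000).round(2) # ~6.76 years
```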

The prudent thing to do would have been to get the previous model, as I might be able to eke out 6.7 years; I can't see making it, or even wanting to make it, to 9.5 years.

The Decision

But in the end, I figured the amount of analyzing and contemplating I had done on this had already cost me a good bit, so I just closed my eyes and hit the "buy" button. I just have to do some "creative" financing to figure out how to pay for it.

I figure if I don't like the new machine (the Touch Bar is of concern to me), I have 14 days to change my mind. And if that fails, since this will be the newest model for about a year, I could always sell it in 6 months or so and maybe recoup about 75% of my original cost.

I would love to hear others' thoughts (especially iOS developers) on how they have justified upgrading, or whether I missed something in my thought process. Till next time.

Tuesday, September 20, 2016

Hiding the Toolbar of a UINavController


So I am in the final stages of finishing up the latest revision of Pain Logger. I had hoped to get it in the App Store prior to iOS 10, but I just couldn't make it.

At any rate, iOS 10 and the new iPhone hit the stores this week, and of course sales jumped.

Sales of the old version, of course :-(

So I got an email from one of my customers asking for a feature.

Since I am basically dev complete I figured, prior to submitting to the store, I would try to squeeze this last requested feature in.

After looking at the requirement I decided the UX for this would be a toolbar to provide a control to activate the feature.

The feature isn't important, but what is important was how I got this to work.

The UI this will go in looks and behaves like a UITableViewController, but it isn't implemented that way.

I moved away from using UITableViewControllers in my UIs and instead use a vanilla UIViewController, placing a UITableView inside its view.

I find this works best as my UIs change.

This particular view is the third one in a UINavigationController stack.

I wanted the navigation bar to show on all views, but I only wanted a toolbar on the bottom of the view in question, which I'll call the DataVC from here on out.

I also wanted to enable the behavior you see in other apps where when you scroll up on the table, the navigation bar AND the toolbar move out of the way.

And of course when I swipe down, I wanted the navigation bar and the toolbar to come back.

So the first issue I ran into was how to get the toolbar on the view to begin with. To do that, in Interface Builder (IB), I turned on the switch to show the toolbar on the UINavigationController itself.

This caused all the connected ViewControllers in the stack to get a toolbar.

So in the viewDidAppear method of each ViewController before the DataVC, I turned off the toolbar like this:

navigationController?.toolbarHidden = true

Now we get to the DataVC itself. Originally the controls in DataVC were configured in the following order:

  • NavigationBar
  • UITableView
  • UIView (to hold ads)

Everything was properly laid out with constraints. I then added these two lines to the DataVC viewDidAppear method:

    navigationController?.toolbarHidden = false
    navigationController?.hidesBarsOnSwipe = true

This got the swiping behavior I was looking for. But when I swiped back down, the NavBar didn't come back.

I found this Stack Overflow post, which solved the problem.

The trickiest part was changing the Top constraint on the UITableView; the answer of how to do that is in the Stack Overflow post.

After that fix, the bars would come back correctly, but cells in the tableview were hidden by the NavBar and the toolbar.

The fix was to turn off the "Under Top Bars" and "Under Bottom Bars" extend-edges settings for the DataVC itself.

At this point it all worked, pretty simple really, the biggest issue was the one solved by the StackOverflow fix I found.

There are some "gotchas" though.

Once you turn on the toolbar and enable the swipe-to-hide functionality, any subsequent ViewController pushed onto the stack will inherit whatever configuration the DataVC left behind.

This wasn't the behavior I wanted.

I only wanted to do it on the DataVC, so at first I was dismayed that I would need to reverse these settings for any ViewController I pushed on the stack after the DataVC.

The key to keeping the configuration localized to the DataVC is the viewWillDisappear method. To put everything back, I implemented it like this:

  override func viewWillDisappear(animated: Bool) {
    super.viewWillDisappear(animated)
    navigationController?.toolbarHidden = true
    navigationController?.hidesBarsOnSwipe = false
    navigationController?.setNavigationBarHidden(false, animated: true)
  }

Works perfectly.

Another issue that kind of irked me was that when I added the toolbar to the root UINavigationController, a number of my constraints on other UIViewControllers started giving me warnings.

This was because I had views that had bottom constraints connected to the bottom of the superview. But when IB puts in the toolbar it leaves the constraint at zero but moves the view up 44 points.

So you get a bunch of warnings when before everything was clean.

I had to review each and adjust accordingly. Annoying, yes, but after thinking about it, it made some sense. I wish IB had been smarter about that.

Oh well, till next time.