Friday, November 11, 2016

Why I'm Buying a New MacBook Pro


I'm going to take a diversion from my regular development subjects and talk about my decision on whether to purchase a new MacBook Pro. A lot has been written about a machine that has barely even hit the market, so I thought I would give one developer's perspective.

Background

Let me give a bit of background:

I currently use a late-2011 13-inch 2.8 GHz i7 MacBook Pro (non-Retina). It was originally configured with a 750 GB hard drive and 4 GB of RAM.

This has been a phenomenal machine. I primarily use it for development of mobile and web applications. I also run several graphics programs and tools to support my development efforts.

You can read my adventures in those areas in my other posts.

Since I bought this machine, I upgraded it to 16GB of RAM and this past summer I could no longer deal with the long compile times, so I installed a 450GB SSD.

I primarily use it connected to an external 4K display and go mobile three or four times a week, for an average of about 6 hours each week, except on holidays when I visit family, during which I am 100% mobile.

I estimate, on average, I have used it about 1000 hours per year, or about 5000 hours in total. I figure just the hardware costs have been around $1850 plus tax. This gives me a total cost of ownership of 37 cents per hour.

For the last 18 months I have limped along, resisting the need to upgrade to a new Mac because I was waiting on the Skylake-based models that just never seemed to materialize.

Had I not installed that SSD, I would have had to upgrade regardless of an impending new model release. Compile times had gotten so sluggish it was almost impossible to develop.

There were also several things I wanted in my development environment. One was the ability to run two external monitors; my current machine cannot do that. Additionally, when I am mobile there just isn't enough screen real estate to use Xcode efficiently.

I mean, typing code is OK, but long lines have to wrap a good bit, and using Interface Builder is practically impossible on the 13-inch screen.

So I feel like I am due. I have been disappointed so many times, hoping for an announcement.

The Announcement

So when the October announcement came, all I could say was FINALLY!!!

Going into the announcement there were some things that concerned me. I knew (or figured at least) the new Macs would not be upgradable. So I knew I was going to have to pay for the configuration I wanted, up front.

I was also thinking that, to help alleviate the screen real estate problem while mobile, I would need to move up to a Retina 15-inch model.

As an added benefit the 15 inch model has stronger CPU and graphics capabilities.

My plan was to purchase the new Mac and an ASUS external monitor. I figured based on the current model prices, I would be able to get a decent machine for less than $3K.

The Contemplation

Unfortunately, as we now know, Apple raised the prices. Dramatically, if you ask me. After speccing out the new machine with the required adapters, I am topping out over $3500.

It sickens me that I am paying such a high price (the cost of a serviceable used car). So I am certainly having buyer's remorse.

I have seen posts from others claiming they are done with Apple because of the price and the removal of all ports except USB-C. These posters state they are going back to Windows.

In fact, this new machine will actually cost more than the just-announced base Microsoft Surface Studio, which looks pretty nice and innovative if you ask me.

But going to Windows just isn't an option for me to continue to support my current development efforts.

I have to run OS X to build iOS-based mobile apps. So no matter how much I might want to purchase a more powerful laptop at half the price, I can't, because it can't run OS X.

If it could, I would seriously consider it.

Let me be clear: I am not a Windows hater, and I truly believe that without the advances and marketing Microsoft did with its Windows OS, we would not be where we are today in technology. I think we owe a lot to them. So I don't understand when all the Apple "Fan-Boys" say they hate Windows when nine times out of ten they can't say specifically why. Haters gotta hate, I guess, but I'm not one of those.

Anyway, friends have reminded me that back in the 80's and 90's we used to pay that kind of money for even the most basic of machines. For example, my first PC cost $3200 in 1989 and I had to get a loan for it. But that was then, this is now.

I still think this new machine just shouldn't be this expensive for what you get. But it is.

As far as it only having USB-C ports, that doesn't bother me. Since I basically 'dock' the machine when I am home, I am used to having cables, adapters, etc. So that argument is a non-starter with me.

So that brings me to the moment of making a decision.

The Analysis

Here is how I did the math.

If I assume that going from the 13-inch I have to a 15-inch of the same year would add a reasonable 20% to the cost, then I am at $2220.

If I then assume a 3% annual inflation rate (which we all know has actually been closer to 0% for the last 5 years), I get a total cost of $2573.59.
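That math can be sketched in a few lines (a quick check, assuming the 3% compounds over the five years since I bought the 13-inch):

```ruby
# Hardware cost of my late-2011 13-inch MacBook Pro
base_13_inch = 1850.0

# Assume a same-year 15-inch would have cost about 20% more
base_15_inch = base_13_inch * 1.20        # $2220.00

# Compound 3% annual inflation over the 5 years since purchase
adjusted = base_15_inch * 1.03**5         # $2573.59

puts format('15-inch equivalent:      $%.2f', base_15_inch)
puts format('Inflation-adjusted cost: $%.2f', adjusted)
```

That $2573.59 is the yardstick I compare the actual drive-out price against.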

Hmm, very interesting: this is really close to what I had budgeted for the new machine plus a $400 external monitor.

So now the delta from what I think I should pay to what I am paying is about $1000. Whew, that's a stretch.

To be fair, I upgraded to the fastest CPU and graphics card and increased the SSD to 1 TB, so I could see adding another $400 for that.

That brings my delta to $600. Getting there but not quite. If the delta at this point had been less than $200 this would be a slam dunk. But it isn't so I had to fret over it. A LOT!!

I also, briefly, considered the "hackintosh" route, but I want to code, not configure a machine for the rest of my life. So I dismissed that option.

Another option is to trudge along with my existing machine till it dies or no longer supports the latest version of OS X. The problem with this is that these things always seem to die when you can least afford the lost time. Not to mention it takes a good bit of time to configure a new machine (especially for a developer) to be productive again. That prospect wasn't very appealing to me.

The final option I considered was to purchase the previous model. I actually spec'd that out and my cost was already at $2500. That was $400 less than what I consider a reasonable cost for the new model and $1000 less than my actual drive-out cost. But getting a machine that is 18 months old wasn't very appealing either. It's certainly not "sexy".

My next question was how long would I need to keep the new machine to get down to the 37 cents per hour cost of ownership? Running the quick math, I get 9459 hours or about 9.5 years.

I'll never make that. For comparison, I would need to keep the previous model for about 6.7 years.
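The break-even math, as a quick sketch (using the 37-cents-per-hour figure from the old machine):

```ruby
# Break-even: hours each machine must be used to match the
# $0.37/hour cost of ownership of my old MacBook Pro
target_rate = 0.37   # dollars per hour

new_model_hours  = 3500 / target_rate   # new model at drive-out cost
prev_model_hours = 2500 / target_rate   # previous model, as spec'd

puts new_model_hours.round    # 9459
puts prev_model_hours.round   # 6757
```

At my roughly 1000 hours per year, that is about 9.5 years for the new model and just under 7 for the previous one.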

The prudent thing to do would have been to get the previous model, as I might be able to eke out 6.7 years; I can't see making it, or even wanting to make it, 9.5 years.

The Decision

But in the end, I figured the amount of analyzing and contemplating I had done on this had already cost me a good bit, so I just closed my eyes and hit the "buy" button. I just have to do some "creative" financing to figure out how to pay for it.

I figure if I don't like the new machine (the touch-bar is of concern to me) I have 14 days to change my mind, and if that fails, then since this will be the newest model for about a year, I could always sell it in 6 months or so and maybe recoup about 75% of my original cost.

I would love to hear others' thoughts (especially from iOS developers) on how they have justified upgrading, or whether I missed something in my thought process. Till next time.

Tuesday, September 20, 2016

Hiding the Toolbar of a UINavController


So I am in the final stages of finishing up the latest revision of Pain Logger. I had hoped to get it into the App Store prior to iOS 10, but I just couldn't make it.

At any rate, iOS 10 and the iPhone hit the stores this week and of course sales jumped.

Sales of the old version, of course :-(

So I got an email from one of my customers asking for a feature.

Since I am basically dev complete I figured, prior to submitting to the store, I would try to squeeze this last requested feature in.

After looking at the requirement I decided the UX for this would be a toolbar to provide a control to activate the feature.

The feature isn't important, but what is important was how I got this to work.

The UI this will go on looks and behaves like a UITableViewController, but it isn't implemented that way.

I moved away from using UITableViewControllers in my UIs and instead use a vanilla UIViewController and place a UITableView in its UIView.

I find this works best as my UIs change.

This particular view is the third view in a stack of views inside a UINavigationController stack.

I wanted the navigation bar to show on all views, but I only wanted a toolbar on the bottom of the view in question; I'll call it the DataVC from here on out.

I also wanted to enable the behavior you see in other apps where when you scroll up on the table, the navigation bar AND the toolbar move out of the way.

And of course when I swipe down, I wanted the navigation bar and the toolbar to come back.

So the first issue I ran into was how to get the toolbar on the view to begin with. To do that, in interface builder (IB), I turned on the switch to show the toolbar on the UINavigationController itself.

This caused all the connected ViewControllers in the stack to get a toolbar.

So in the viewDidAppear method of each ViewController before the DataVC, I turned off the toolbar like this:

navigationController?.toolbarHidden = true

Now we get to the DataVC itself. Originally the controls in DataVC were configured in the following order:

  • NavigationBar
  • UITableView
  • UIView (to hold ads)

Everything was properly laid out with constraints. I then added these two lines to the DataVC viewDidAppear method:

    navigationController?.toolbarHidden = false
    navigationController?.hidesBarsOnSwipe = true

This got the swiping behavior I was looking for. But when I swiped down, the NavBar didn't come back.

I found this Stack Overflow post, which solved the problem.

The trickiest part was changing the Top constraint on the UITableView. The answer of how to do that is in the StackOverflow link.

After that fix, the bars would come back correctly, but cells in the tableview were hidden by the NavBar and the toolbar.

The fix was to turn off the "Under Top Bars" and "Under Bottom Bars" settings for the DataVC itself.

At this point it all worked, pretty simple really, the biggest issue was the one solved by the StackOverflow fix I found.

There are some "gotchas" though.

Once you turn on the toolbar and enable the swipe-to-hide functionality, any subsequent ViewController pushed on the stack will inherit the configuration as it was when the DataVC was left.

This wasn't the behavior I wanted.

I only wanted to do it on the DataVC, so at first I was dismayed that I would need to reverse these settings for any ViewController I pushed on the stack after the DataVC.

The key to keep the configuration localized to the DataVC is to use the viewWillDisappear method. So to put everything back, I implemented the method like this:

  override func viewWillDisappear(animated: Bool) {
    super.viewWillDisappear(animated)
    // Put the navigation controller back to its defaults so
    // view controllers pushed after the DataVC don't inherit them
    navigationController?.toolbarHidden = true
    navigationController?.hidesBarsOnSwipe = false
    navigationController?.setNavigationBarHidden(false, animated: true)
  }

Works perfectly.

Another issue that kind of irked me was that when I added the toolbar to the root UINavigationController, a number of my constraints on other UIViewControllers started giving me warnings.

This was because I had views that had bottom constraints connected to the bottom of the superview. But when IB puts in the toolbar it leaves the constraint at zero but moves the view up 44 points.

So you get a bunch of warnings when before everything was clean.

I had to review each and adjust accordingly. Annoying, yes, but after thinking about it, it made some sense. I wish IB would have been smarter on that.

Oh well, till next time.

Sunday, August 21, 2016

It Should Have Just Worked

Out of the Dark

I was looking at my blog today and I realized I have been slacking off a bit when it comes to making updates.

So I am going to correct this.

First, What have I been doing?

Well, between working my regular day job, my two main side projects, family, and just life in general, there hasn't been much time for blogging.

With this post I intend to do the following:

  • Update my status on my side projects
  • Identify future topics
  • Document my latest "eureka" moment. (More of a "duh" moment really)

Pain Logger

Pain Logger is my main mobile app.

I have two versions on Apple's app store, a free version, with limited functionality, and a paid version.

Both apps were written in Objective-C and I have been extremely busy rewriting the paid version entirely in Swift.

If I had it to do over, I would advise myself not to rewrite the entire app.

But, here I am, and I am finally starting to see the light at the end of the tunnel, or the on-coming train, I'm not sure which.

I'm still struggling with what to do about the free version and whether (and how) I should merge the two code bases.

be

Be (we actually spell it with a lower case 'b', don't ask me why, that was the CEO's idea) is an application I have been working on with a small team for the past few months.

We are still in the prototype phase.

I am primarily responsible for the server side and the web client.

We are still a ways off from release, but things are moving forward and I have learned a lot from the project.

My latest effort on this project has been building an administrative portal and separating the web client from its integration with Rails' sprockets.

Future Posts

With football (that is American Football for those outside the US) season rapidly approaching, to say I will be distracted is an understatement.

But I figure, I can do a lot of blogging while watching football.

Over the next few weeks I plan to talk about the following topics/experiences:

  • How I implemented theming in Pain Logger
  • The sync solution I used for Pain Logger
  • My experience in converting 'be' to a NodeJS build tool chain from Rails sprockets
  • The homegrown 'Flux' implementation I used for Pain Logger

My latest hurdle

Finally, I want to talk about one of those 'duh' moments I had recently.

You know the story: you find a problem, you look at your code, you say to yourself 'this couldn't possibly be failing', yet it is.

So here goes.

One of the new features I am adding to Pain Logger is support for GeoFencing.

Yeah, if you are thinking to yourself this crazy idiot just said he was rewriting his app in Swift and now he is also adding new features at the same time, HAS HE LOST HIS MIND!!!!???

You would be right.

In fact this isn't the only new feature I am adding to Pain Logger.

Looking back now, and recognizing I still haven't got the update out, is a pretty good indicator this was A VERY BAD IDEA.

Anyway, I digress.

It was fairly simple to get geofencing going from a coding point of view, but testing was something different.

When I first deployed it on my iPad everything was fine. It worked perfectly, but on my iPhone, it didn't work. What!?

I tried a TON of different things, including deleting and reinstalling the app, turning cellular data on and off, and changing any setting I thought that might possibly be affecting this on my phone.

But to no avail.

I can't even begin to enumerate the amount of logging I added to the app to see what was going on.

No matter what I tried, my iPad (wifi only) would work, even when I wasn't connected to a network, and my cell phone, which is always connected, would not.

So I began 'googling'.

Eventually, after spending a lot of time on Stack Overflow, I took a different tack and asked myself, 'If I was a user having this problem, how would I search for an answer?'

I really thought I was headed to the "Genius" bar.

That line of thought led me to an Apple support post and the ultimate solution.

It turns out there was one more setting (shocking I know :-/) that I hadn't changed nor had I run across.

It is found under Settings->Privacy->Location Services->System Services.

You have to turn on Location-Based Alerts.

One of the reasons this was so hard for me to find was that the "System Services" option is found at the bottom of the list of ALL of the apps you have installed.

One would have thought a standard option that is never removed would have been found first in the list, but nope.

Yeah, it makes sense now. How this setting ever got turned off, I'll never know, but it's another speed bump overcome, even if it was stupidity on my part.

Till next time.

Sunday, June 5, 2016

Devise Two-Step



Let me first say I am not a Devise expert by any stretch of the imagination. The project I am working on has a requirement to use two different models for authentication, and I could not find a complete example of how to set up Devise to do this.

The documentation and the several web sites and Stack Overflow articles I found gave hints on how to do it, but I still had questions. So I thought I would write this post to show how I set up a test project to do it end to end.

My goal is to have two models, User and Admin, that are both authenticated using Devise. Each model (I'll call it an account from here on out when referring to both) will have a set of pages that can only be accessed when the account is logged in.

Additionally, these protected pages are only accessible by one of the two accounts (depending on who is logged in). Finally, each account will have its own (and separate) sign-in page and layout.

Much of the advice on the net recommended using a role based approach to solve this problem, but for many reasons that won't work for the app I will eventually apply this to. So a role based solution is off the table.

I started this investigation at the Devise website. Unfortunately, I could not follow it well enough to get a successful implementation. It felt like I was missing something.

The problem I kept running into was that I would log in as an Admin, and when I logged out I would get redirected to the User's login page; or, if I logged in as the User, I could then get to the Admin's protected pages when I shouldn't. So something just wasn't clicking.

I next worked through this tutorial with the following deviations:

  • When I copied the devise views into the app I put them in their own directories with this command:
`rails generate devise:views users`
  • I did not do the section on Sending E-Mail and DelayedJob as they were not germane to the problem I was trying to solve

Once I had that going I created my Admin model and migrated my database.

rails generate devise Admin
rake db:migrate

Note: I did not add a name attribute to the Admin model as was done in the tutorial for the User model, since there was no need for it. I also left the registration support in so I could quickly add admin users, but obviously in a real app you wouldn't want to allow that.

I next created the Devise views for the Admin model (I had already copied the user ones when working through the tutorial):

rails generate devise:views admins

This creates two directories under app/views, users and admins respectively. Under each directory will be subdirectories for the different Devise features that are supported.

Next I created subdirectories in the controllers directory called users and admins, and in those subdirectories I created controllers for registrations and sessions. This has to be done in order to customize registration and authentication for each of the models.

So for the User side I created a directory under controllers called ... wait for it ... users.

The registration controller (named registrations_controller.rb) looks like this:

class Users::RegistrationsController < Devise::RegistrationsController
  # disable default no_authentication action
  skip_before_action :require_no_authentication, only: [:new, :create, :cancel]
  
  protected

  def sign_up(resource_name, resource)
    # just overwrite the default one
    # to prevent auto sign in as the new sign up
  end
end

The session controller (named sessions_controller.rb) looks like this:

class Users::SessionsController < Devise::SessionsController
  # disable default no_authentication action
  skip_before_action :require_no_authentication, only: [:new, :create, :cancel]
end

Next is the registration controller for the Admin model (same name as above):

class Admins::RegistrationsController < Devise::RegistrationsController
  # disable default no_authentication action
  skip_before_action :require_no_authentication, only: [:new, :create, :cancel]
  
  protected

  def sign_up(resource_name, resource)
    # just overwrite the default one
    # to prevent auto sign in as the new sign up
  end
end

and the session controller for Admin (same name as above):

class Admins::SessionsController < Devise::SessionsController
  # disable default no_authentication action
  skip_before_action :require_no_authentication, only: [:new, :create, :cancel]
  # now we need admin to register new admin
  #prepend_before_action :authenticate_scope!, only: [:new, :create, :cancel]

  protected


  # def sign_up(resource_name, resoure)
  #   # just overwrite the default one
  #   # to prevent auto sign in as the new sign up
  # end
end

We are almost there, I promise. Next are the changes to the routes.rb file:

Rails.application.routes.draw do
#  devise_for :admins
  devise_for :admins, module: 'admins', controllers: {sessions: 'admins/sessions', registrations:'admins/registrations'}
#  devise_for :users
  devise_for :users, module: 'users', controllers: {sessions: 'users/sessions', registrations:'users/registrations'}

  # These are the protected routes.
  # The pages controller is for the user model and the
  # admin_pages is for the admin model
  get '/secret', to: 'pages#secret', as: :secret
  get '/adminsecret', to: 'admin_pages#secret', as: :adminsecret
  get '/userhome', to: 'pages#index'
  get '/adminhome', to: 'admin_pages#index'

  # Define the root for when a user is authenticated
  authenticated :user do
    root 'pages#index', as: :authenticated_user_root
  end

  # define the root for when an admin is authenticated
  authenticated :admin do
    root 'admin_pages#index', as: :authenticated_admin_root
  end

  # default root (should never use this)
  root to: 'pages#index'
end

When you run the Devise generator it will add a devise_for call to your routes. I have left those in (but commented out) to show what the default is, so you can see how I adjusted the Devise routes for the different models to point to their respective controllers.

Also you can see the "secret" pages and how they are defined as regular routes. I'll show them shortly. Finally I defined scoped routes for the "roots" for the different models.

Next is the application controller. This is where the "glue/magic" happens. I have included comments in the code so you can see what I was doing/thinking.

class ApplicationController < ActionController::Base
  protect_from_forgery with: :exception
  before_action :configure_permitted_parameters, if: :devise_controller?

  respond_to :html, :json

  # the layout should be specified by the resource (i.e. admin or user)
  layout :layout_by_resource

  protected

  # ensure name is allowed
  def configure_permitted_parameters
    devise_parameter_sanitizer.for(:sign_up) << :name  
    devise_parameter_sanitizer.for(:account_update) << :name
  end

  # if we are using a devise controller then we can check the
  # resource_name and return the appropriate layout, this can 
  # also be done at the controller level
  def layout_by_resource
    if devise_controller? && resource_name == :admin
      'adminlayout'
    elsif devise_controller? && resource_name == :user
      'userlayout'
    else
      'application'
    end
  end

  # Specify where to go after successful login again this is 
  # dictated by the resource that logged in
  def after_sign_in_path_for(resource)
    if devise_controller? && resource_name == :admin
      authenticated_admin_root_path
    elsif devise_controller? && resource_name == :user
      authenticated_user_root_path
    else
      root_path
    end
  end

  # Specify where to go after successful logout; again this is
  # dictated by the resource that logged out
  def after_sign_out_path_for(resource)
    if devise_controller? && resource_name == :admin
      new_admin_session_path
    elsif devise_controller? && resource_name == :user
      new_user_session_path
    else
      root_path
    end
  end

end

Next up is the protected content. For the user, he is directed to the pages index when properly authenticated. For the admin, she is directed to the admin_pages index when successfully authenticated. The controller for pages looks like this:

class PagesController < ApplicationController
  before_action :authenticate_user!

  layout 'userlayout'
end

By adding the before_action to the controller all routes will be forced to be authenticated.

The controller for the pages accessible to the admin looks very similar:

class AdminPagesController < ApplicationController
  before_action :authenticate_admin!
  layout 'adminlayout'
end

Again all routes in this controller are protected by the before_action but this time the session must be a logged in admin.
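For what it's worth, the reason the two scopes stay independent is that Devise metaprograms a separate authenticate_<scope>! helper for each model you call devise_for on. Here is a toy sketch of that idea in plain Ruby; this is NOT Devise's actual source, and FakeController and its scope list are made up purely for illustration:

```ruby
# Toy illustration of per-scope authentication helpers, loosely
# modeled on how Devise defines authenticate_user!/authenticate_admin!.
class FakeController
  def initialize(signed_in_scopes)
    @signed_in = signed_in_scopes   # e.g. [:user] or [:admin]
  end

  # Define one authenticate_<scope>! method per model
  def self.define_auth_helper(scope)
    define_method("authenticate_#{scope}!") do
      raise "redirect to #{scope} sign-in" unless @signed_in.include?(scope)
      true
    end
  end

  define_auth_helper(:user)
  define_auth_helper(:admin)
end

session = FakeController.new([:user])   # a logged-in User
session.authenticate_user!              # passes
begin
  session.authenticate_admin!           # a User session fails the admin filter
rescue RuntimeError => e
  puts e.message
end
```

That separation is why before_action :authenticate_user! and before_action :authenticate_admin! can protect the two page controllers independently.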

With that said, we are done. I modified the various views to indicate which one was being shown when you loaded it up. Check the completed code here for what those views look like.

In summary, I guess I was just too dense to understand it at first, but the key to all of this turned out to be how the application controller is configured.