iPhone X and Why I’m (sort of) Ignoring the HIG

What Apple Wants

The iPhone X has been out for several weeks now, and slowly but surely, I’m seeing app compatibility updates hit the store. According to Apple and the HIG, the message is loud and clear: “embrace the notch!” But will designers and developers follow those instructions? Immediately after the initial announcement, there were plenty of ideas (and jokes) about how to deal with that pesky notch.

I’ve been a bit slow getting my metronome app Click ready for the iPhone X, and I ended up making some design choices that I thought might make for an interesting blog post: a sort of middle ground between following the letter of the “law” in the HIG and going completely wild by totally hiding the notch or calling attention to it. These decisions were also influenced by the existing style and purpose of the app, so I’ll talk briefly about those factors as well. So, without further ado, let’s dive in!

Here are the most important official summary statements from Apple about handling the “key display features”:

Avoid explicitly placing interactive controls at the very bottom of the screen and in corners. People use swipe gestures at the bottom edge of the display to access the Home screen and app switcher, and these gestures may cancel custom gestures you implement in this area. The far corners of the screen can be difficult areas for people to reach comfortably.

Don’t mask or call special attention to key display features. Don’t attempt to hide the device’s rounded corners, sensor housing, or indicator for accessing the Home screen by placing black bars at the top and bottom of the screen. Don’t use visual adornments like brackets, bezels, shapes, or instructional text to call special attention to these areas, either.
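As a concrete illustration of that first guideline, here’s a minimal Objective-C sketch – not from Click, and `controlsView` is a hypothetical subview – of pinning controls to the safe area on iOS 11+, which keeps them clear of the home indicator and the rounded corners:

```objc
// A sketch, assuming iOS 11+ and a view controller with a
// hypothetical `controlsView` subview holding the interactive controls.
- (void)viewDidLoad {
    [super viewDidLoad];

    UILayoutGuide *safe = self.view.safeAreaLayoutGuide;
    self.controlsView.translatesAutoresizingMaskIntoConstraints = NO;

    // Constrain to the safe area rather than the view's edges, so the
    // controls never sit under the home indicator or in the corners.
    [NSLayoutConstraint activateConstraints:@[
        [self.controlsView.leadingAnchor constraintEqualToAnchor:safe.leadingAnchor],
        [self.controlsView.trailingAnchor constraintEqualToAnchor:safe.trailingAnchor],
        [self.controlsView.bottomAnchor constraintEqualToAnchor:safe.bottomAnchor],
    ]];
}
```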

My Design

So, will people follow the rules? How flexible will app review be with apps that push the boundaries? It certainly appears that many people do not like the notch, as evidenced by sites like notchless.space for creating notch-hiding backgrounds! And what did I decide to do with my design? I’ll include the current preview video of Click from before the iPhone X as a little intro to the app’s existing design. A few things probably jump out right away as potentially interesting on iPhone X: the completely black background, and the presence of a LOT of elements laid out well into the corners of the display.


Hiding the Notch?



Photoshop Layer Comps

Just a quick Photoshop tip today, but it’s something I’ve been making extensive use of over the last few weeks, so I thought I’d share. If you happened to read my last post and/or watch the video, you’ll have seen that in my new metronome app, I’m handling interface rotation somewhat differently than most apps do. Rather than using the standard system autorotation – the usual springs and struts in Interface Builder, or a UIView’s autoresizingMask property – I leave the basic layout of the controls the same and just rotate their contents. It’s kind of hard to describe, so if that doesn’t make sense, skip to about the 10:00 mark in this video.

Here’s the gist of the code to make this happen:

  • In the main view controller’s shouldAutorotateToInterfaceOrientation: method, I return YES only for UIInterfaceOrientationLandscapeRight, the same orientation the app launches in. In other words, the view controller will not do any auto-rotating once it has loaded into its initial state.
  • I’ve registered for the UIDeviceOrientationDidChangeNotification. Even though the View Controller will not do anything automatically when the device is rotated, the system will still generate these notifications when the orientation changes.
  • When I receive this notification, I pass the message along, and the individual views apply whatever sort of rotation transform they need to in order to remain “right side up.”
  • If the status bar is visible, you can also set its orientation programmatically with: [[UIApplication sharedApplication] setStatusBarOrientation:(UIInterfaceOrientation)orientation animated:YES];
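The steps above could be sketched roughly like this – a hedged reconstruction, not the app’s actual code, where `rotateSubviewsToOrientation:` is a hypothetical helper that forwards the new orientation to each subview:

```objc
- (BOOL)shouldAutorotateToInterfaceOrientation:(UIInterfaceOrientation)orientation {
    // Only allow the launch orientation; all other rotation is handled manually.
    return orientation == UIInterfaceOrientationLandscapeRight;
}

- (void)viewDidLoad {
    [super viewDidLoad];

    // The system only posts UIDeviceOrientationDidChangeNotification
    // after notifications have been turned on.
    [[UIDevice currentDevice] beginGeneratingDeviceOrientationNotifications];
    [[NSNotificationCenter defaultCenter] addObserver:self
                                             selector:@selector(deviceOrientationDidChange:)
                                                 name:UIDeviceOrientationDidChangeNotification
                                               object:nil];
}

- (void)deviceOrientationDidChange:(NSNotification *)note {
    UIDeviceOrientation orientation = [[UIDevice currentDevice] orientation];
    // Pass the message along; each subview applies whatever rotation
    // transform (e.g. CGAffineTransformMakeRotation) keeps it "right side up".
    [self rotateSubviewsToOrientation:orientation];
}
```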

What this means from a design perspective is that the UIImageViews themselves, which contain the main interface chrome, do NOT rotate at all. So, here on the right is what the main control frame looks like in the launch orientation – notice the shadows, gradients, etc. all use the “canonical” iOS 90-degree light source.

Let’s say the user then rotates to LandscapeLeft – my subviews will rotate themselves, but the image will stay exactly the same. The image on the left is the same one, rotated 180 degrees. It’s striking how different – and how much more noticeable – the light/shadow/gradient effects are when they’re flipped around the wrong way!

So, in order to maintain the right look, I need to create separate images for each orientation and load them in as part of my custom rotation handling. Here’s where Photoshop Layer Comps come in: they let you take snapshots of certain aspects of your document’s state and then reload them with one click. In my case, I’ve set up one Layer Comp for each of the four orientations I’ll support. Here’s the workflow:

  • Set up the document for the current orientation. In the case of LandscapeRight, that means 90-degree light sources for all drop shadows, gradients that go light-to-dark from top to bottom, etc.
  • In the Layer Comps panel – open it from the Window menu if you don’t see it – select New Layer Comp from the panel menu.
  • In the dialog box that opens, give your comp a name, select which parts of the document state you want saved as part of the snapshot, and add any helpful comments.
  • For this particular case, I’ve told the Layer Comp to only save the Layer Styles of the document’s layers.
  • Repeat the process for each orientation, setting the light sources, gradients, etc. on the Layer Styles, and then saving it as a new Layer Comp.

By using vector/smart objects and layer styles – you are doing that, aren’t you? – the exact same set of objects and layers is used for every orientation. I’m free to adjust the positioning, size, and shape of the objects, and then, when it comes time to export for each orientation, I just click through the four Layer Comps one by one, and all my light and shadow effects are applied instantly to every object. It takes a bit of work to set up, but once it’s ready, it saves huge amounts of time over visiting each object individually and resetting its properties every time I want to change the design and re-export for each orientation.

For things like the “Tap,” “+,” and “-” labels, and for different button states, I also have a set of Layer Comps that control layer visibility. So, for example, if I need to re-export the image for the “pressed” Tap button, I hit the Layer Comp for the orientation I want, which loads the correct layer styles, then hit the “Tap Button Pressed” comp, which won’t affect the layer styles but will hide the normal Tap button layers and show the pressed ones. Two clicks and I’m ready to export.

So, that’s how I’ve been using Layer Comps to speed up my design workflow – hopefully it gives you some ideas for how you might use them in your own!