Managing colors with extensions in Swift

May 12, 2019

A full-time developer on the web and part-time dabbler in the world of iOS development, I recently tried to create some color variables to use throughout a project in a single location. In most web development projects, you might put these inside a utils folder and import them wherever they are needed:

Define color variable:

// utils/colors.js

export const dark = '#151951' // Default for all text

Use the variable:

// components/container.js

import { dark } from '../utils/colors'

const Container = () => {
  return <p style={{ color: dark }}></p>
}

export default Container

But in iOS development with Swift, there’s a problem with this. Unlike on the web, you don’t explicitly declare what you are importing and exporting in each file. Instead, methods of classes just magically become available throughout the project. While convenient, how will others reading your code know where the methods are coming from?

We could use a Swift protocol to namespace all the utils under one type, or do something a bit simpler, which I prefer: prefix every custom extension member with an underscore so we know at a glance that it's one of our own utility definitions. The same example above would then look like this, using extensions in Swift:

Using hex color values in iOS requires some extra computation, because UIColor doesn't accept hex strings directly. You can find dozens of different implementations of this on StackOverflow. The following approach is borrowed from Jared Davidson, and the emoji logging idea from Andyy Hope's Pretty in Print series.
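To see what that extra computation amounts to, here's a UIKit-free sketch of the hex-to-RGB step. `parseHex` is a name I've made up for illustration, and it uses Swift's integer radix initializer in place of the article's Scanner so it runs anywhere:

```swift
import Foundation

// Hypothetical helper sketching the hex-to-RGB computation.
// (The article's UIColor version uses Scanner instead.)
func parseHex(_ hex: String) -> (red: Int, green: Int, blue: Int)? {
  // Clean the string and strip a leading hash
  var hexString = hex.trimmingCharacters(in: .whitespacesAndNewlines).uppercased()
  if hexString.hasPrefix("#") {
    hexString.remove(at: hexString.startIndex)
  }
  // Require exactly six hex digits
  guard hexString.count == 6, let rgb = UInt32(hexString, radix: 16) else {
    return nil
  }
  // Mask and shift out each 8-bit channel
  return (
    red: Int((rgb & 0xFF0000) >> 16),
    green: Int((rgb & 0x00FF00) >> 8),
    blue: Int(rgb & 0x0000FF)
  )
}

print(parseHex("#151951") ?? "invalid")
```

Each channel comes back in the 0–255 range; dividing by 255 gives the 0–1 CGFloat values that UIColor's initializer expects.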

Define color variable:

//  Colors.swift

import UIKit

extension UIColor {
  // Defined colors
  static let _dark = UIColor().hex("#151951")

  func hex(_ hex: String) -> UIColor {
    // Clean string
    var hexString = hex.trimmingCharacters(in: .whitespacesAndNewlines).uppercased()
    // Remove hash
    if hexString.hasPrefix("#") {
      hexString.remove(at: hexString.startIndex)
    }
    // Log warning and fall back to white if hex value is not 6 characters
    if hexString.count != 6 {
      print("⚠️ Hex value #\(hexString) is not 6 characters. Please enter a value with 6 characters.")
      return UIColor.white
    }
    // Scan the six hex digits into an integer, then mask out each channel
    var rgb: UInt32 = 0
    Scanner(string: hexString).scanHexInt32(&rgb)
    return UIColor(
      red: CGFloat((rgb & 0xFF0000) >> 16) / 255.0,
      green: CGFloat((rgb & 0x00FF00) >> 8) / 255.0,
      blue: CGFloat(rgb & 0x0000FF) / 255.0,
      alpha: 1.0
    )
  }
}

Use the color variable:

// CustomLabel.swift

import UIKit

class CustomLabel: UILabel {
  required init?(coder aDecoder: NSCoder) {
    super.init(coder: aDecoder)
    self.textColor = UIColor._dark
  }
}
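For comparison, the protocol/namespace route mentioned earlier can be approximated with a caseless enum, a common Swift idiom for namespacing. `Palette` and its member are names I've made up for illustration:

```swift
// A caseless enum can't be instantiated, so it serves purely as a
// namespace for related constants. (Palette is a hypothetical name.)
enum Palette {
  // Hex string for the app's default text color (value from the article)
  static let darkHex = "#151951"
}

print(Palette.darkHex)
```

Call sites then read `Palette.darkHex`, which makes the origin of the value obvious without an underscore convention; in a real project the string would be fed through the hex helper above to produce a UIColor.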

As I’m rapidly realizing, things you can do simply on the web with JavaScript don’t often come for free in iOS with Swift. This may be due to the relative newness of the language, particular design choices made by the architects, or a mix of both.

While I know it’s not exactly helpful to apply web thinking to a native iOS context, I find it useful to compare and contrast implementations as I’m learning to write more idiomatic Swift. Hopefully, I’ll come out a better software developer at the end of the day.

Cheers! 🍻

Thanks for reading ❤️

If you're jazzed about this post, feel free to tweet this article 🐦

If I missed something, please do drop me a message and I'll fix it 🔨

Otherwise, read more articles! ✍️