Type inference
Up to this point in this book, each time you’ve seen a variable or constant declared it’s been accompanied by an associated type, like this:
let integer: Int = 42
let double: Double = 3.14159
You may be asking yourself why you need to bother writing the : Int and : Double, since the right-hand side of the assignment is already an Int or a Double. It’s redundant, to be sure; your crazy-clever brain can see this without too much work.
It turns out the Swift compiler can deduce this as well. It doesn’t need you to tell it the type all the time — it can figure it out on its own. This is done through a process called type inference. Not all programming languages have this, but Swift does, and it’s a key component of Swift’s power as a language.
So, you can simply drop the type in most places where you see one. For example, consider the following constant declaration:
let typeInferredInt = 42
Sometimes it’s useful to check the inferred type of a variable or constant. You can do this in a playground by holding down the Option key and clicking on the variable or constant’s name. Xcode will display a popover like this:
Xcode tells you the inferred type by giving you the declaration you would have had to use if there were no type inference. In this case, the type is Int.
It works for other types, too:
let typeInferredDouble = 3.14159
Option-clicking on this reveals the following:
You can see from this that type inference isn’t magic. Swift is simply doing what your brain does very easily. Programming languages that don’t use type inference can often feel verbose, because you need to specify the often obvious type each time you declare a variable or constant.
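If you’d rather confirm an inferred type in code than by Option-clicking, one option is the standard library’s type(of:) function, which returns the type of a value at runtime. A minimal sketch using the constants above:

```swift
let typeInferredInt = 42
let typeInferredDouble = 3.14159

// type(of:) returns the runtime type of a value, which here
// matches the type Swift inferred at compile time.
print(type(of: typeInferredInt))    // prints "Int"
print(type(of: typeInferredDouble)) // prints "Double"
```

This is handy outside of playgrounds, where Option-clicking isn’t always convenient.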
Note: In later chapters, you’ll learn about more complex types where sometimes Swift can’t infer the type. That’s a pretty rare case though, and you’ll see type inference used for most of the code examples in this book — except in cases where we want to highlight the type for you.
Sometimes you want to define a constant or variable and ensure it’s a certain type, even though what you’re assigning to it is a different type. You saw earlier how you can convert from one type to another. For example, consider the following:
let wantADouble = 3
Here, Swift infers the type of wantADouble as Int. But what if you wanted Double instead?
The first thing you could do is the following:
let actuallyDouble = Double(3)
This works like the type conversion you saw earlier.
Another option would be to not use type inference at all and do the following:
let actuallyDouble: Double = 3
There is a third option, like so:
let actuallyDouble = 3 as Double
This uses a new keyword, as, which also performs a type conversion. You’ll see it used throughout the book.
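All three approaches produce the same result. Here’s a sketch that checks this with type(of:) — the constant names are just for illustration, chosen to avoid redeclaring actuallyDouble:

```swift
let viaInitializer = Double(3)   // conversion initializer
let viaAnnotation: Double = 3    // explicit type annotation
let viaAs = 3 as Double          // the as keyword

// All three constants are Doubles holding the same value.
assert(type(of: viaInitializer) == Double.self)
assert(type(of: viaAnnotation) == Double.self)
assert(type(of: viaAs) == Double.self)
assert(viaInitializer == viaAnnotation && viaAnnotation == viaAs)
```

Which one you pick is largely a matter of style; all three tell the compiler the same thing.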
Note: Literal values like 3 don’t have a type. It’s only when using them in an expression or assigning them to a constant or variable that Swift infers a type for them.
A literal number value that doesn’t contain a decimal point can be used as an Int as well as a Double. This is why you’re allowed to assign the value 3 to the constant actuallyDouble.
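To see the difference between an untyped literal and a value that already has a type, consider this sketch (the constant names are just for illustration):

```swift
// The literal 3 has no fixed type, so Swift infers Double for it here:
let literalWorks: Double = 3 + 0.14159

// But a value that's already an Int won't convert implicitly:
let three = 3                            // inferred as Int
// let fails = three + 0.14159           // error: cannot mix Int and Double
let converted = Double(three) + 0.14159  // explicit conversion works
```

Once a value has a concrete type like Int, you’re back to needing an explicit conversion; only literals get this flexibility.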
Literal number values that do contain a decimal point cannot be integers. This means we could have avoided this entire discussion had we started with
let wantADouble = 3.0
Sorry! :]