Gate delay measures the “speed” of a circuit: it is the number of gates between an input and an output along the critical path, the longest such path through the circuit.
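As a sketch, the critical path can be found by treating the circuit as a directed acyclic graph and taking the longest path from any input. The circuit below (two AND gates feeding an OR gate) and the `gate_delay` helper are hypothetical names for illustration:

```python
# Hypothetical circuit as a DAG: each gate maps to the signals feeding it.
# Primary inputs (a, b, c, d) have no entry and contribute zero delay.
circuit = {
    "and1": ["a", "b"],
    "and2": ["c", "d"],
    "or1": ["and1", "and2"],
}

def gate_delay(node, circuit):
    """Delay = 1 for this gate, plus the worst-case delay of its inputs."""
    if node not in circuit:      # a primary input
        return 0
    return 1 + max(gate_delay(inp, circuit) for inp in circuit[node])

print(gate_delay("or1", circuit))  # longest path: input -> AND -> OR = 2
```

Every input-to-output path here happens to pass through exactly two gates, so the critical path and the shortest path coincide; in larger circuits they usually differ.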

Technically, any Boolean function can be implemented in two gate delays by reading the truth table directly: one AND gate per row where the output is 1, all feeding a single OR gate (assuming both the inputs and their complements are available and gates may have arbitrary fan-in). This is impractical for more than a handful of inputs, however, since a function of n inputs can require up to 2^n AND gates.
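A quick way to see the blow-up is to count the gates in this naive two-level AND-OR construction. The sketch below is illustrative (the `sop_gate_count` and `xor_table` helpers are made-up names); it uses n-input XOR, a worst case where half of all truth-table rows are 1:

```python
from itertools import product

def sop_gate_count(truth_table):
    """Gates in a naive two-level AND-OR circuit: one AND gate per
    minterm (row whose output is 1), plus one OR gate to combine them."""
    minterms = sum(1 for out in truth_table if out)
    return minterms + (1 if minterms > 1 else 0)

def xor_table(n):
    """Truth table of n-input XOR: output is the parity of the inputs."""
    return [sum(bits) % 2 for bits in product([0, 1], repeat=n)]

for n in (2, 4, 8, 16):
    print(n, sop_gate_count(xor_table(n)))
# The delay stays at 2, but the gate count is 2**(n-1) + 1:
# n=2 -> 3, n=4 -> 9, n=8 -> 129, n=16 -> 32769.
```

So the two-gate-delay bound trades depth for width: the circuit is as fast as possible, but its size grows exponentially with the number of inputs.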