
Load Factor in HashMap

by Online Tutorials Library


The HashMap is one of the high-performance data structures in the Java collections framework. It provides constant-time performance for insertion and retrieval. Two factors affect the performance of a HashMap:

  • Initial Capacity
  • Load Factor

We have to choose these two factors carefully when creating a HashMap object. Both the initial capacity and the load factor can be passed as arguments to the HashMap constructor.
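As a quick illustration, here is a minimal sketch of the three standard HashMap constructors that control these settings (class name and values are chosen for the example):

```java
import java.util.HashMap;
import java.util.Map;

public class HashMapConfigDemo {
    public static void main(String[] args) {
        // Defaults: initial capacity 16, load factor 0.75f
        Map<String, Integer> defaults = new HashMap<>();

        // Custom initial capacity only; load factor stays at 0.75f
        Map<String, Integer> sized = new HashMap<>(64);

        // Custom initial capacity and custom load factor
        Map<String, Integer> tuned = new HashMap<>(64, 0.6f);

        tuned.put("a", 1);
        System.out.println(tuned.get("a")); // prints 1
    }
}
```

Passing a larger initial capacity up front avoids repeated resizing when you already know roughly how many entries the map will hold.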

Initial Capacity of HashMap

The initial capacity of the HashMap is the number of buckets in the hash table. It is set when the HashMap object is created. The default initial capacity of the HashMap is 2^4, i.e., 16. The capacity of the HashMap is doubled each time the number of entries reaches the threshold, so the capacity grows to 2^5 = 32, 2^6 = 64, and so on.

Suppose we have implemented the hashCode() method so that the key-value pairs are distributed equally among the 16 buckets.

Consider the following scenarios:

  • If there are 16 elements in the HashMap, the hashCode() method distributes one element to each bucket. Searching for any item, in this case, takes only one lookup.
  • If there are 32 elements in the HashMap, the hashCode() method distributes two elements to each bucket. Searching for any item, in this case, takes at most two lookups.
  • Similarly, if there are 128 elements in the HashMap, the hashCode() method distributes eight elements to each bucket. Searching for any item, in this case, takes at most eight lookups.

We can observe from the above scenarios that even as the number of items in the HashMap keeps doubling, the maximum lookup time in each bucket does not grow dramatically and stays small.
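The three scenarios above can be sketched with a small helper (the method name is ours, chosen for the example) that computes the worst-case lookups per bucket, assuming keys hash evenly across the buckets:

```java
public class BucketLookupDemo {
    // Assuming an even distribution of keys, the worst-case number of
    // lookups in a single bucket is ceil(items / buckets).
    static int maxLookups(int items, int buckets) {
        return (items + buckets - 1) / buckets; // integer ceiling division
    }

    public static void main(String[] args) {
        int buckets = 16; // default HashMap capacity
        System.out.println(maxLookups(16, buckets));  // prints 1
        System.out.println(maxLookups(32, buckets));  // prints 2
        System.out.println(maxLookups(128, buckets)); // prints 8
    }
}
```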

This works because the HashMap grows its bucket count in powers of two (2^n), doubling each time the number of entries reaches its threshold.

Load Factor

The load factor is a measure that decides when to increase the HashMap capacity in order to maintain O(1) complexity for the get() and put() operations. The default load factor of HashMap is 0.75f (i.e., 75% of the map's capacity).

Problem

The problem is that if we keep the bucket count fixed (i.e., 16) while the total number of items in the map keeps growing, the time complexity degrades.

Solution

When we increase the total number of buckets, the number of items in each bucket stops growing. This lets us keep a roughly constant number of items in each bucket and maintain O(1) time complexity for the get() and put() operations.

How Load Factor is calculated

Load Factor decides “when to increase the number of buckets.”

We can find when to increase the HashMap size by using the following formula:

Threshold = initial capacity * load factor

The initial capacity of the HashMap = 16
The default load factor of the HashMap = 0.75
According to the formula mentioned above: 16 * 0.75 = 12

This means the HashMap keeps its size at 16 buckets up to and including the 12th key-value pair. As soon as the 13th element (key-value pair) is inserted into the HashMap, it increases its capacity from the default 2^4 = 16 buckets to 2^5 = 32 buckets.
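The threshold arithmetic above can be checked directly (a minimal sketch using the default values from this article):

```java
public class ThresholdDemo {
    public static void main(String[] args) {
        int capacity = 16;        // default initial capacity (2^4)
        float loadFactor = 0.75f; // default load factor

        // Resize threshold = capacity * load factor
        int threshold = (int) (capacity * loadFactor);
        System.out.println(threshold); // prints 12
    }
}
```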

Another way to calculate size:

The HashMap increases its capacity when the load factor ratio (m/n) exceeds 0.75.

Where,

      m is the number of entries in the HashMap.

      n is the total capacity (number of buckets) of the HashMap.

Example of Load Factor

Let’s understand the load factor through an example.

We know that the default bucket count of the HashMap is 16. When we insert the first element, we check whether we need to increase the HashMap's capacity. This is determined by the ratio m/n from the formula above.

In this case, the size of the HashMap is 1, and the bucket count is 16. So, 1/16 = 0.0625. Now compare this value with the default load factor:

                    0.0625 < 0.75

So, there is no need to increase the HashMap's size.

We do not need to increase the size of the HashMap up to the 12th element, because

                    12/16 = 0.75

This load factor is equal to the default load factor, i.e., 0.75.

As soon as we insert the 13th element into the HashMap, its capacity is increased, because:

                    13/16 = 0.8125

This is greater than the default load factor:

                    0.8125 > 0.75

So now the HashMap must increase its number of buckets.
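The whole example can be simulated with a short sketch. This is a simplified model of HashMap's resizing rule, not the actual library internals: after each insertion it compares the ratio m/n against the load factor and doubles the capacity when the ratio exceeds 0.75:

```java
public class ResizeSimulation {
    public static void main(String[] args) {
        int capacity = 16;        // default initial capacity
        float loadFactor = 0.75f; // default load factor
        int size = 0;             // number of entries (m)

        // Insert 13 entries; after each one, check the ratio size/capacity
        for (int i = 1; i <= 13; i++) {
            size++;
            float ratio = (float) size / capacity;
            if (ratio > loadFactor) {
                capacity *= 2; // the map doubles its bucket count
                System.out.println("Resized to " + capacity
                        + " buckets at element " + i);
            }
        }
        System.out.println("Final capacity: " + capacity);
        // prints: Resized to 32 buckets at element 13
        //         Final capacity: 32
    }
}
```

Note that element 12 gives exactly 12/16 = 0.75, which does not exceed the load factor, so the resize happens only at element 13, matching the walkthrough above.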

If you want to keep the get() and put() complexity at O(1), it is advisable to keep the load factor around 0.75.

