Java Collections

ArrayList Vs HashSet:

Load Factor:

Unlike HashMap, ArrayList does not use a load factor. It grows only when the backing array is full.

For example, with the default initial capacity of 10, the backing array is resized while adding the 11th element.
 
Growth Rate: new_capacity = current_capacity + current_capacity/2 (roughly 1.5x)
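
As a quick illustration of that growth rule, here is a small sketch (the class name is made up for this post; the real resize logic lives inside ArrayList.grow()):

package com.vasanth.java.collections;

/**
 * Illustrates how an ArrayList-style capacity grows by roughly 1.5x.
 * Sketch only; ArrayList does not expose its capacity directly.
 */
public class ArrayListGrowthSketch {

    public static void main(String[] args) {
        int capacity = 10; // default initial capacity
        for (int i = 0; i < 5; i++) {
            System.out.println("capacity = " + capacity);
            capacity = capacity + capacity / 2; // current + current/2
        }
        // prints 10, 15, 22, 33, 49
    }
}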

Example:

package com.vasanth.java.collections;
import java.util.ArrayList;
import java.util.HashSet;
/**
 * 
 * @author Vasanth
 *
 */
public class ArrayListVsHashSet {

    public static void main(String args[]) {

        /* ArrayList
         * - Implements the List interface
         * - Backed by an array
         * - Insertion order is maintained
         * - Duplicates are allowed
         * - Multiple nulls are allowed, no restriction
         * - Index based: we can retrieve/remove an object by index, e.g. get(index)/remove(index)
         */
        ArrayList<Object> arrayList = new ArrayList<>();
        System.out.println("===== ArrayList =====");

        arrayList.add("hello");
        arrayList.add(20d);
        arrayList.add(1);
        arrayList.add(10L);

        arrayList.add(null);
        arrayList.add(20d);
        arrayList.add(1);
        arrayList.add(10L);
        arrayList.add(null);
        System.out.println("arrayList : " + arrayList);
        System.out.println("arrayList by index : " + arrayList.get(0));
        arrayList.remove(2);
        System.out.println("arrayList after remove : " + arrayList);
        arrayList.add(2, "vasanth");
        System.out.println("arrayList after add by index : " + arrayList);

        /* HashSet
         * - Implements the Set interface
         * - Backed by a HashMap
         * - Unordered
         * - Duplicates are not allowed
         * - Only one null is allowed
         * - Object based: we can't retrieve by index, HashSet does not provide get()
         */
        HashSet<Object> hashSet = new HashSet<>();
        System.out.println("===== HashSet =====");
        hashSet.add("hello");
        hashSet.add(20);
        hashSet.add(1);
        hashSet.add(10);
        hashSet.add(null);
        hashSet.add(20);
        hashSet.add(1);
        hashSet.add(10);
        hashSet.add(null);
        System.out.println("hashSet : " + hashSet);
    }

}

==============================================================

HashMap / Synchronized HashMap / ConcurrentHashMap:

Load Factor:

For HashMap, DEFAULT_INITIAL_CAPACITY = 16 and DEFAULT_LOAD_FACTOR = 0.75f, so the resize threshold = 16 * 0.75 = 12 entries. When the thirteenth entry is added, the capacity (array size) of the HashMap is doubled.
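
A small sketch of how that threshold (capacity * load factor) moves as the table doubles (the class name is made up here; the real logic lives inside HashMap.resize()):

package com.vasanth.java.collections;

/**
 * Illustrates the relationship between a HashMap-style capacity,
 * its load factor, and the resize threshold. Sketch only.
 */
public class HashMapThresholdSketch {

    public static void main(String[] args) {
        int capacity = 16;        // DEFAULT_INITIAL_CAPACITY
        float loadFactor = 0.75f; // DEFAULT_LOAD_FACTOR
        for (int i = 0; i < 4; i++) {
            int threshold = (int) (capacity * loadFactor);
            System.out.println("capacity = " + capacity + ", resize after " + threshold + " entries");
            capacity = capacity * 2; // capacity doubles on resize
        }
        // prints 16/12, 32/24, 64/48, 128/96
    }
}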

Example:


package com.vasanth.java.collections;

import java.util.Collections;
import java.util.HashMap;
import java.util.Map;
import java.util.concurrent.ConcurrentHashMap;

public class HashMapDemo {

    public static void main(String args[]) {
        HashMap<String, String> emp = new HashMap<>();
        emp.put("1", "Vasanth1");
        emp.put("2", "Vasanth2");
        emp.put("3", "Vasanth3");
        System.out.println("HashMap");
        emp.forEach((k, v) -> {
            System.out.println(k + "->" + v);
        });

        // Synchronized Map
        /*
         * - Locks the whole map for every operation
         * - Allows inserting null as a key (it still wraps a HashMap)
         * - Multiple threads cannot access the map concurrently, so performance is
         *   lower than ConcurrentHashMap
         */
        Map<String, String> syncMap = Collections.synchronizedMap(emp);
        System.out.println("SynchronizedMap : " + syncMap);

        // Concurrent Map
        /*
         * - Locks only a portion of the map
         * - Does not allow null as a key or value
         * - Implements the ConcurrentMap interface; the underlying data structure is a hash table
         * - By default it supports a concurrency level of 16
         * - Any number of threads can perform retrieval operations at the same time,
         *   but to update an entry a thread must lock the particular segment/bucket it wants
         *   to operate on. This locking mechanism is known as segment locking or bucket locking.
         *   Hence, 16 update operations can be performed by threads at a time.
         */
        ConcurrentHashMap<String, String> cMap = new ConcurrentHashMap<>();
        cMap.put("1", "Vasanth1");
        cMap.put("2", "Vasanth2");
        cMap.put("3", "Vasanth3");
        System.out.println("ConcurrentHashMap : " + cMap);
    }
}
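
As a rough illustration of those guarantees, here is a sketch (not part of the original demo; the class name and thread counts are made up) where several threads update the same ConcurrentHashMap without any external locking:

package com.vasanth.java.collections;

import java.util.concurrent.ConcurrentHashMap;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.TimeUnit;

/**
 * Several threads update the same ConcurrentHashMap concurrently.
 * merge() is atomic per key, so the final counts are always 1000 each.
 */
public class ConcurrentHashMapSketch {

    public static void main(String[] args) throws InterruptedException {
        ConcurrentHashMap<String, Integer> counts = new ConcurrentHashMap<>();
        ExecutorService pool = Executors.newFixedThreadPool(4);

        for (int i = 0; i < 1000; i++) {
            pool.submit(() -> {
                counts.merge("a", 1, Integer::sum); // atomic per-key update
                counts.merge("b", 1, Integer::sum);
            });
        }

        pool.shutdown();
        pool.awaitTermination(10, TimeUnit.SECONDS);
        System.out.println(counts); // {a=1000, b=1000}
    }
}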

==========================================================================

LinkedHashMap and LinkedHashSet:


LinkedHashMap is just like HashMap, with the additional feature of maintaining the insertion order of its entries.

LinkedHashSet is an ordered version of HashSet that maintains a doubly-linked list across all of its elements, so it iterates in insertion order.
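
For a quick illustration (a sketch with made-up keys; the class name is illustrative), both classes iterate in insertion order while their Hash counterparts make no such guarantee:

package com.vasanth.java.collections;

import java.util.HashMap;
import java.util.HashSet;
import java.util.LinkedHashMap;
import java.util.LinkedHashSet;
import java.util.Map;
import java.util.Set;

/**
 * LinkedHashMap/LinkedHashSet iterate in insertion order;
 * HashMap/HashSet make no ordering guarantee.
 */
public class LinkedHashMapVsLinkedHashSet {

    public static void main(String[] args) {
        Map<String, String> linkedMap = new LinkedHashMap<>();
        linkedMap.put("one", "1");
        linkedMap.put("two", "2");
        linkedMap.put("three", "3");
        System.out.println("LinkedHashMap : " + linkedMap); // {one=1, two=2, three=3}

        Map<String, String> hashMap = new HashMap<>(linkedMap);
        System.out.println("HashMap       : " + hashMap);   // order not guaranteed

        Set<String> linkedSet = new LinkedHashSet<>();
        linkedSet.add("one");
        linkedSet.add("two");
        linkedSet.add("three");
        System.out.println("LinkedHashSet : " + linkedSet); // [one, two, three]

        Set<String> hashSet = new HashSet<>(linkedSet);
        System.out.println("HashSet       : " + hashSet);   // order not guaranteed
    }
}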
