public static void mergeSort(int[] array) {
    if (array.length <= 1) { // base case of the recursion (also handles an empty array)
        return;
    }
    // Split the current array into left and right halves
    int mid = array.length / 2;
    int[] left = new int[mid];
    int[] right = new int[array.length - mid];
    for (int i = 0; i < mid; i++) {
        left[i] = array[i];
    }
    for (int j = mid; j < array.length; j++) {
        right[j - mid] = array[j];
    }
    mergeSort(left);
    mergeSort(right);

    int leftIndex = 0;
    int rightIndex = 0;
    int overallIndex = 0; // index into the current array
    // use the lengths as the bound condition
    while (leftIndex < left.length && rightIndex < right.length) {
        // the condition has to include <=, or the sort loses stability
        if (left[leftIndex] <= right[rightIndex]) {
            array[overallIndex++] = left[leftIndex++];
        } else {
            array[overallIndex++] = right[rightIndex++];
        }
    }
    // copy over whatever remains in the side that was not exhausted
    while (leftIndex < left.length) {
        array[overallIndex++] = left[leftIndex++];
    }
    while (rightIndex < right.length) {
        array[overallIndex++] = right[rightIndex++];
    }
}
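The comparison loop alone is not enough: as soon as one side runs out, the leftovers of the other side still have to be copied across. The merge step can be exercised on its own; a minimal self-contained sketch (the `merge` helper name is mine, not from the notes):

```java
import java.util.Arrays;

public class MergeDemo {
    // Merge two sorted arrays into dest, including the leftover copies
    // that run after the main comparison loop.
    static void merge(int[] left, int[] right, int[] dest) {
        int l = 0, r = 0, d = 0;
        while (l < left.length && r < right.length) {
            if (left[l] <= right[r]) {
                dest[d++] = left[l++];
            } else {
                dest[d++] = right[r++];
            }
        }
        // Exactly one of these loops runs: copy whichever side is left over.
        while (l < left.length) dest[d++] = left[l++];
        while (r < right.length) dest[d++] = right[r++];
    }

    public static void main(String[] args) {
        int[] dest = new int[5];
        merge(new int[]{1, 3, 5}, new int[]{2, 4}, dest);
        System.out.println(Arrays.toString(dest)); // [1, 2, 3, 4, 5]
    }
}
```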
private static final java.util.Random RAND = new java.util.Random();

private static void quickSort(int[] array, int startIndex, int endIndex) {
    if (startIndex >= endIndex) {
        return; // zero or one element is already sorted
    }
    // pick a random pivot index in [startIndex, endIndex]
    int pivotIndex = startIndex + RAND.nextInt(endIndex - startIndex + 1);
    int pivot = array[pivotIndex];
    // Swap pivot to start
    swap(array, startIndex, pivotIndex);
    int i = startIndex + 1;
    int j = endIndex;
    // this while loop ensures that the order around the pivot is correct
    while (i <= j) {
        while (i <= j && array[i] <= pivot) {
            i++; // already on the correct side, move forward
        }
        while (i <= j && array[j] > pivot) {
            j--; // already on the correct side, move backward
        }
        if (i <= j) {
            // both elements are on the wrong side of the pivot, so swap them
            swap(array, i, j);
            i++;
            j--;
        }
    }
    // Place pivot in its correct position (at j)
    swap(array, startIndex, j);

    // Recursive calls on the two partitions
    quickSort(array, startIndex, j - 1);
    quickSort(array, j + 1, endIndex);
}
private static void swap(int[] array, int i, int j) {
    int temp = array[i];
    array[i] = array[j];
    array[j] = temp;
}
Why do we need recursion here? Why can’t we just use a for-loop and traverse everything?
Essentially, each call has to sort two separate sub-arrays, one on the left and one on the right, and each of those splits again in exactly the same way. Recursion expresses this self-similar structure far more simply than a plain for-loop could; a single linear traversal has no natural way to sort both halves and then merge or partition them.
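One reason the recursion terminates so quickly: the sub-arrays halve at every level, so there are only about ⌈log₂ n⌉ levels. A tiny standalone check (the `levels` helper is mine, just for counting):

```java
public class HalvingDemo {
    // Count how many times an array of size n halves before reaching size 1,
    // always following the larger half.
    static int levels(int n) {
        int levels = 0;
        while (n > 1) {
            n = (n + 1) / 2; // size of the larger half after a split
            levels++;
        }
        return levels;
    }

    public static void main(String[] args) {
        System.out.println(levels(8));   // 3
        System.out.println(levels(100)); // 7
    }
}
```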
Why do we need the pivotIndex to be randomly generated?
Because a random pivot makes badly unbalanced splits unlikely. If we always picked, say, the first element, an already-sorted array would produce one empty partition at every level and the running time would degrade to O(n²). A random pivot keeps the left and right partitions reasonably balanced on average, which is what lets the two pointers (i, j) do useful work on both sides.
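A tiny model of the degenerate case (hypothetical helper, not the notes' code): on a sorted range with the pivot always taken as the first element, the "left" partition is empty, so each recursion level only peels off one element and the depth grows linearly with n.

```java
public class PivotDepthDemo {
    // Recursion depth of quicksort on a sorted range [lo, hi] when the
    // pivot is always the first element: each level shrinks the range by one.
    static int worstCaseDepth(int lo, int hi) {
        if (lo >= hi) {
            return 0;
        }
        return 1 + worstCaseDepth(lo + 1, hi);
    }

    public static void main(String[] args) {
        // 100 elements → depth 99, i.e. O(n) stack depth instead of O(log n)
        System.out.println(worstCaseDepth(0, 99)); // 99
    }
}
```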
public static void LSDRadixSort(int[] array) {
    if (array.length == 0) {
        return;
    }
    // Initialize 10 buckets for the digits 0-9.
    // The full algorithm uses 19 buckets to account for negative numbers.
    final int RADIX = 10;
    Queue<Integer>[] buckets = new LinkedList[RADIX];
    for (int i = 0; i < RADIX; i++) {
        buckets[i] = new LinkedList<>();
    }
    // Find the maximum number to determine the number of passes (digits)
    int max = array[0];
    for (int num : array) {
        if (num > max) {
            max = num;
        }
    }
    int maxDigit = 0;
    while (max > 0) {
        max /= 10;
        maxDigit++;
    }
    int base = 10;
    for (int digitIndex = 0; digitIndex < maxDigit; digitIndex++) {
        // Distribute elements into buckets based on the current digit
        int divisor = base / 10; // isolates the current digit (LSD first)
        for (int num : array) {
            int digit = (num / divisor) % 10;
            buckets[digit].add(num); // add to the corresponding bucket
        }
        // Collect elements back into the array in order (buckets 0-9)
        int index = 0;
        for (Queue<Integer> bucket : buckets) {
            while (!bucket.isEmpty()) {
                array[index++] = bucket.poll(); // FIFO order keeps the sort stable
            }
        }
        base *= 10; // move to the next digit (e.g., 10 → 100)
    }
}
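The expression `(num / divisor) % 10` is what isolates one decimal digit per pass, with divisor = 1, 10, 100, and so on. A quick standalone check (the `digitAt` helper name is mine):

```java
public class DigitDemo {
    // Extract one decimal digit of num, selected by divisor (1, 10, 100, ...).
    static int digitAt(int num, int divisor) {
        return (num / divisor) % 10;
    }

    public static void main(String[] args) {
        // For 472: divisor 1 → 2 (ones), 10 → 7 (tens), 100 → 4 (hundreds)
        System.out.println(digitAt(472, 1));   // 2
        System.out.println(digitAt(472, 10));  // 7
        System.out.println(digitAt(472, 100)); // 4
    }
}
```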
Why do we need 19 buckets?
Because a digit can range from -9 to +9, which is 19 possible values. This implementation ignores the negative half, though.
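If we did want to handle negatives, one common trick (a sketch of mine, not the notes' code) is to shift every digit by 9, mapping the range [-9, 9] onto bucket indices [0, 18]. Note that Java's `/` and `%` truncate toward zero, so the digit of a negative number comes out negative:

```java
public class BucketIndexDemo {
    // Map the signed digit of num (selected by divisor) from [-9, 9]
    // to a bucket index in [0, 18].
    static int bucketIndex(int num, int divisor) {
        return (num / divisor) % 10 + 9;
    }

    public static void main(String[] args) {
        System.out.println(bucketIndex(-472, 10)); // digit -7 → bucket 2
        System.out.println(bucketIndex(472, 10));  // digit  7 → bucket 16
        System.out.println(bucketIndex(0, 1));     // digit  0 → bucket 9
    }
}
```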
Why do we need a queue (FIFO) for each bucket?

Collecting from each bucket in first-in, first-out order preserves the relative order of elements that share the current digit, and that stability between passes is exactly what makes LSD radix sort correct. (The code above uses LinkedList through the Queue interface with add and poll, which is FIFO.)
Probably the hardest method to implement in an open-addressing hashmap is the put method. We are just going to implement it in a simple form. Also, we are not digging into the theory of choosing a good hash function, so ours is just Math.abs(key.hashCode() % table.length).
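One detail worth seeing in isolation: `hashCode()` can be negative (for an Integer key it is the value itself), so the `Math.abs` is needed before indexing. A self-contained sketch of the compression step (the `compress` helper name is mine; note the well-known caveat that `Math.abs(Integer.MIN_VALUE)` is still negative):

```java
public class CompressDemo {
    // Compress an arbitrary hashCode into a table index in [0, tableLength).
    static int compress(Object key, int tableLength) {
        return Math.abs(key.hashCode() % tableLength);
    }

    public static void main(String[] args) {
        // Integer.hashCode is the value itself, so -5 has a negative hash.
        System.out.println(compress(-5, 13)); // Math.abs(-5 % 13) = 5
        System.out.println(compress(42, 13)); // 42 % 13 = 3
    }
}
```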
public V put(K key, V value) {
    // if the load factor would exceed MAX_LOAD_FACTOR, expand the table first
    if ((double) size / table.length > MAX_LOAD_FACTOR) {
        resize(table.length * 2 + 1);
    }
    int hash = Math.abs(key.hashCode() % table.length);
    int firstRemoved = -1;
    int noInfinite = 0; // limits the loop so it cannot run forever
    while (table[hash] != null && noInfinite < size) {
        if (table[hash].getKey().equals(key) && !table[hash].isRemoved()) {
            // we found the key: replace the value and return the old one
            V returnValue = table[hash].getValue();
            table[hash].setValue(value);
            return returnValue;
        }
        if (!table[hash].isRemoved()) {
            // a live entry: count it toward the probe limit
            noInfinite++;
        } else if (firstRemoved == -1) {
            // we found a removed slot, but we don't want duplicate keys,
            // so we only record the first removed position and keep probing
            firstRemoved = hash;
        }
        // advance the probe
        hash = (hash + 1) % table.length;
    }
    // out of the loop: make the final judgement.
    // Insert into the first removed slot if we saw one, else at the probe's end.
    if (firstRemoved != -1) {
        table[firstRemoved] = new LinearProbingMapEntry<>(key, value);
        size++;
        return null;
    }
    table[hash] = new LinearProbingMapEntry<>(key, value);
    size++;
    return null;
}
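One subtle bug to avoid in the load-factor check: `size / table.length` is integer division and evaluates to 0 whenever size < table.length, so the cast to double must happen before dividing. A quick standalone demonstration (the numbers are just examples):

```java
public class LoadFactorDemo {
    public static void main(String[] args) {
        int size = 13, tableLength = 27;
        final double MAX_LOAD_FACTOR = 0.67;

        System.out.println(size / tableLength);          // 0: integer division
        System.out.println((double) size / tableLength); // ≈ 0.481: the real load factor

        // The check must cast first, otherwise it is always comparing 0:
        System.out.println((double) size / tableLength > MAX_LOAD_FACTOR); // false
    }
}
```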