Konstantin
Level 36
Odesa

The Q&A from job interviews: algorithms in Java, part 1

Published in the Random group
Development projects use various algorithms more often than you might think. For example, suppose we need to sort some data by certain parameters (columns) so we can navigate through the data without much effort. So it wouldn't be strange at all for a job interviewer to ask you about a specific basic algorithm and perhaps give you the task of implementing it in code. And since you're on this website, I'll be so bold as to assume that you're writing in Java. That's why today I suggest that you familiarize yourself with some basic algorithms and with specific examples of how to implement them in Java. By "some", I mean:
  1. Overview of array sorting algorithms:
    • bubble sort,
    • selection sort,
    • insertion sort,
    • Shell sort,
    • quicksort,
    • merge sort,
  2. Greedy algorithms
  3. Pathfinding algorithms
    • depth-first search
    • breadth-first search
  4. Dijkstra's Shortest Path First algorithm
Well, without further ado, let's get down to business.

1. Overview of sorting algorithms

Bubble sort

This sorting algorithm is known primarily for its simplicity, but it is also one of the slowest. As an example, let's consider a bubble sort for numbers in ascending order. Let's imagine a random sequence of numbers. We'll perform the following steps on these numbers, starting from the beginning of the sequence:
  • compare two numbers;
  • if the number on the left is larger, then swap them;
  • move one position to the right.
After performing these steps on the entire sequence, we will find that the largest number is at the end of our series of numbers. Then we make another pass over the sequence, executing exactly the same steps as above. But this time we won't include the last element of the list, since it is the largest number and already precisely where it should be when the numbers are sorted. Once again, we'll end up moving the next largest number to the end of our sequence. Of course, that means the two largest numbers are standing in their proper places. Again, we make passes over the sequence, excluding the elements that are already in their places, until all the elements are in the required order. Let's take a look at how bubble sort is implemented in Java code:

public class Solution {
   public static void main(String[] args) {
    int[] testArr = new int[]{6,3,8,2,6,9,4,11,1};
    bubbleSort(testArr);
       for (int i : testArr) {
           System.out.println(i);
       }
   }

   public static void bubbleSort(int[] array) {
       for (int i = array.length - 1; i > 0; i--) { // The outer loop must go all the way down to i == 1, or the first pair is never compared
         for (int j = 0; j < i; j++) {
             if (array[j] > array[j+1]) {
                 int temp = array[j];
                 array[j] = array[j+1];
                 array[j+1] = temp;
             }
         }
       }
   }
}
As you can see, there is nothing complicated here. Everything would be just great if not for one shortcoming — bubble sort is very, very slow. Its time complexity is O(N²), since we have nested loops: the outer loop runs about N times, and for each of those iterations the inner loop also runs up to N times. As a result, we get roughly N*N, or N², iterations.
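One easy optimization worth mentioning in an interview: if a full pass makes no swaps, the array is already sorted and we can stop immediately. That brings the best case (an already sorted array) down to O(N). A sketch:

```java
public class BubbleSortEarlyExit {
    // Bubble sort with an early exit: if a pass performs no swaps,
    // the array is already sorted and we can stop.
    public static void bubbleSort(int[] array) {
        for (int i = array.length - 1; i > 0; i--) {
            boolean swapped = false;
            for (int j = 0; j < i; j++) {
                if (array[j] > array[j + 1]) {
                    int temp = array[j];
                    array[j] = array[j + 1];
                    array[j + 1] = temp;
                    swapped = true;
                }
            }
            if (!swapped) {
                return; // No swaps in this pass: the array is sorted
            }
        }
    }

    public static void main(String[] args) {
        int[] testArr = {6, 3, 8, 2, 6, 9, 4, 11, 1};
        bubbleSort(testArr);
        System.out.println(java.util.Arrays.toString(testArr));
    }
}
```

The worst case is still O(N²), but on nearly sorted data this version finishes far sooner.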

Selection sort

This algorithm is similar to bubble sort, but it works a little faster. Again, as an example, let's take a sequence of numbers that we want to arrange in ascending order. The essence of this algorithm is to sequentially iterate through all the numbers and select the smallest element, which we take and swap with the leftmost element (the 0th element). Here we have a situation similar to bubble sort, but in this case our sorted element will be the smallest. Therefore, the next pass through the elements will start from the element with index 1. We'll repeat these passes until all the elements have been sorted. Implementation in Java:

public class Solution {
   public static void main(String[] args) {
       int[] testArr = new int[]{6, 3, 8, 2, 6, 9, 4, 11, 1};
       sortBySelect(testArr);
       for (int i : testArr) {
           System.out.println(i);
       }
   }

   public static void sortBySelect(int[] array) {

       for (int i = 0; i < array.length-1; i++) { // An ordinary outer loop
           int min = i;

           for (int j = i + 1; j < array.length; j++) { // An ordinary loop, but one that accounts for the sorted numbers
               if (array[j] < array[min]) {
                   min = j;
               }
           }
           int temp = array[i];     // Put the sorted number in the proper cell
           array[i] = array[min];
           array[min] = temp;
       }
   }
}
This algorithm is superior to bubble sort, because the number of required swaps is reduced from O(N²) to O(N) — we aren't driving one element through the entire list. However, the number of comparisons is still O(N²).
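To make that difference concrete, here is an instrumented version (the counters are my addition) that tallies comparisons and swaps. On a 100-element reverse-sorted array, selection sort makes N*(N-1)/2 = 4950 comparisons but only N-1 = 99 swaps:

```java
public class SelectionSortCounted {
    static long comparisons = 0;
    static long swaps = 0;

    public static void sortBySelect(int[] array) {
        for (int i = 0; i < array.length - 1; i++) {
            int min = i;
            for (int j = i + 1; j < array.length; j++) {
                comparisons++; // Every pair in the unsorted part is compared
                if (array[j] < array[min]) {
                    min = j;
                }
            }
            // Exactly one swap per outer iteration: N-1 swaps in total
            int temp = array[i];
            array[i] = array[min];
            array[min] = temp;
            swaps++;
        }
    }

    public static void main(String[] args) {
        int[] arr = new int[100];
        for (int i = 0; i < arr.length; i++) {
            arr[i] = arr.length - i; // Reverse-sorted input
        }
        sortBySelect(arr);
        System.out.println(comparisons + " comparisons, " + swaps + " swaps"); // 4950 comparisons, 99 swaps
    }
}
```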

Insertion sort

Once again, consider a sequence of numbers that we want to arrange in ascending order. This algorithm places a marker that divides the array: all of the elements to the left of the marker are already sorted among themselves. At each step of the algorithm, the element at the marker is taken and inserted at its proper position in the sorted part, and the marker moves one position to the right. The sorted part grows until all the elements have been examined. But where does the initial sorted part come from? An array consisting of just the first element is already sorted, isn't it? So the marker starts at index 1. Let's take a look at the implementation in Java:

public class Solution {
   public static void main(String[] args) {
       int[] testArr = new int[]{6, 3, 8, 8, 6, 9, 4, 11, 1};
       insertionSort(testArr);
       for (int i : testArr) {
           System.out.println(i);
       }
   }

   public static void insertionSort(int[] array) {

       for (int i = 1; i < array.length; i++) { // i is the dividing marker
           int temp = array[i]; // Make a temporary copy of the marked element
           int j = i;
           while (j > 0 && array[j - 1] >= temp) { // Until a smaller element is found
               array[j] = array[j - 1]; // We shift the element to the right
               --j;
           }
           array[j] = temp;   // Insert the marked element in its proper place
       }
   }
}
This type of sort is superior to the ones described above, because although it has the same O(N²) running time, in practice it is about twice as fast as bubble sort and slightly faster than selection sort.
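If an interviewer pushes further, one refinement worth knowing is to find each insertion point with binary search instead of a linear scan. The number of comparisons drops to O(N*logN), although the element shifts keep the overall running time at O(N²). A sketch:

```java
import java.util.Arrays;

public class BinaryInsertionSort {
    // Insertion sort that locates each insertion point with binary search.
    // Comparisons: O(N*logN); element shifts are still O(N^2) in the worst case.
    public static void binaryInsertionSort(int[] array) {
        for (int i = 1; i < array.length; i++) {
            int temp = array[i];
            // Binary-search the sorted prefix [0, i) for the first element greater than temp
            int low = 0, high = i;
            while (low < high) {
                int mid = (low + high) / 2;
                if (array[mid] <= temp) {
                    low = mid + 1;
                } else {
                    high = mid;
                }
            }
            // Shift everything from the insertion point one cell to the right
            System.arraycopy(array, low, array, low + 1, i - low);
            array[low] = temp;
        }
    }

    public static void main(String[] args) {
        int[] testArr = {6, 3, 8, 8, 6, 9, 4, 11, 1};
        binaryInsertionSort(testArr);
        System.out.println(Arrays.toString(testArr));
    }
}
```

Because the search finds the first element strictly greater than `temp`, equal elements keep their relative order, so the sort stays stable.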

Shell sort

This sort is essentially a modified insertion sort. What am I talking about? Let's put first things first.

We must first choose an interval. There are many approaches to making this choice, and we won't go into too much detail about them here. Let's divide our array in half and get some number — this will be our interval. So, if we have 9 elements in our array, then our interval will be 9/2 = 4.5. We discard the fractional part and get 4, since array indices can only be integers.

We will use this interval to form our groups. If an element has index 0, then the index of the next element in its group is 0+4, that is, 4, and the third element will have index 4+4, that is, 8. In the second group, the elements will have indices 1 and 5; the third and fourth groups work the same way. As a result, from the number array {6,3,8,8,6,9,4,11,1} we get four groups:

I — {6,6,1}
II — {3,9}
III — {8,4}
IV — {8,11}

The elements retain their places in the general array, but we have marked them as members of the same group: {6,3,8,8,6,9,4,11,1}

Next, insertion sort is applied to each of these groups, after which they look like this:

I — {1,6,6}
II — {3,9}
III — {4,8}
IV — {8,11}

In the general array, the cells occupied by the groups remain the same, but their internal order changes, according to the order of the groups above: {1,3,4,8,6,9,8,11,6}. The array has become a little more ordered, hasn't it?

The next interval is the previous one divided by 2: 4/2 = 2. We now have two groups:

I — {1,4,6,8,6}
II — {3,8,9,11}

In the general array, we have: {1,3,4,8,6,9,8,11,6}. We run the insertion sort algorithm on both groups and get this array: {1,3,4,8,6,9,6,11,8}

Now our array is almost sorted. We need to perform a final iteration of the algorithm: we divide the interval by 2 again: 2/2 = 1. We get a single group comprising the entire array: {1,3,4,8,6,9,6,11,8}. Running the insertion sort algorithm on that, we get: {1,3,4,6,6,8,8,9,11}

Let's take a look at how we can bring this sort to life in Java code:

public class Solution {
   public static void main(String[] args) {
       int[] testArr = new int[]{6, 3, 8, 8, 6, 9, 4, 11, 1};
       shellSort(testArr);
       for (int i : testArr) {
           System.out.println(i);
       }
   }

   public static void shellSort(int[] array) {
       int length = array.length;
       int step = length / 2;
       while (step > 0) {
           for (int i = step; i < length; i++) { // We pass over all of our groups at once
               int j = i - step;
               while (j >= 0 && array[j] > array[j + step]) { // Insertion sort inside the group
                   int temp = array[j];
                   array[j] = array[j + step];
                   array[j + step] = temp;
                   j -= step; // Step back by the interval, staying inside the same group
               }
           }
           step = step / 2; // Shrink the interval
       }
   }
}
Shell sort's performance is not easy to characterize, since the results depend heavily on the chosen gap sequence. Experimental estimates range from O(N^(3/2)) to O(N^(7/6)).
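Speaking of gap sequences: halving the interval, as above, is the simplest choice but not the best one. Knuth's sequence 1, 4, 13, 40, ... (each gap is 3 times the previous plus 1) is a popular improvement. Here's a sketch of the same sort with Knuth gaps:

```java
import java.util.Arrays;

public class ShellSortKnuth {
    // Shell sort using Knuth's gap sequence: 1, 4, 13, 40, ...
    public static void shellSort(int[] array) {
        int length = array.length;
        int step = 1;
        while (step <= length / 3) {
            step = step * 3 + 1; // Grow the gap: 1, 4, 13, 40, ...
        }
        while (step > 0) {
            // Gapped insertion sort for the current step
            for (int i = step; i < length; i++) {
                int temp = array[i];
                int j = i;
                while (j >= step && array[j - step] > temp) {
                    array[j] = array[j - step]; // Shift within the group
                    j -= step;
                }
                array[j] = temp;
            }
            step = (step - 1) / 3; // Next smaller Knuth gap
        }
    }

    public static void main(String[] args) {
        int[] testArr = {6, 3, 8, 8, 6, 9, 4, 11, 1};
        shellSort(testArr);
        System.out.println(Arrays.toString(testArr));
    }
}
```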

Quicksort

This is one of the most popular algorithms, so it is worth paying special attention to. The gist of this algorithm is that a pivot element is selected in a list of elements. We sort all the other elements relative to the pivot element. Values less than the pivot element are on the left. Values greater than it are on the right. Next, pivot elements are also selected in the right and left parts, and the same thing happens: the values are sorted relative to these elements. Then pivot elements are selected in the newly formed parts, and so on until we get a sorted sequence. The following Java implementation of this algorithm uses recursion:

public class Solution {
   public static void main(String[] args) {
       int[] testArr = new int[]{6, 3, 8, 8, 6, 9, 4, 11, 1};
       fastSort(testArr);
       for (int i : testArr) {
           System.out.println(i);
       }
   }

   public static void fastSort(int[] array) {
       recursionFastSort(array, 0, array.length - 1);
   }


   public static void recursionFastSort(int[] array, int min, int max) {
       if (array.length == 0) // Condition for exiting recursion if the array length is 0
           return;

       if (min >= max) // Terminate the recursion, since there is nothing to divide
           return;


       int middle = min + (max - min) / 2; // Select the middle
       int middleElement = array[middle];


       int i = min, j = max;
       while (i <= j) { // Every element less than the middle element will be to the left, and large ones will be to the right
           while (array[i] < middleElement) {
               i++;
           }
           while (array[j] > middleElement) {
               j--;
           }

           if (i <= j) { // Swap places
               int temp = array[i];
               array[i] = array[j];
               array[j] = temp;
               i++;
               j--;
           }
       }

       if (min < j) // Make a recursive call on the elements that are less than middle
           recursionFastSort(array, min, j);

       if (max > i) // Make a recursive call on the elements larger than middle
           recursionFastSort(array, i, max);
   }
}
Without a doubt, quicksort is the most popular of these algorithms, since in most situations it runs faster than the others. Its average time complexity is O(N*logN), although a consistently unlucky choice of pivot can degrade it to O(N²).
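Note that the implementation above always takes the middle element as the pivot, which handles already-sorted input well but can still be unlucky on crafted data. A standard guard is to pick the pivot at random. Here's a compact sketch of that variant:

```java
import java.util.Arrays;
import java.util.Random;

public class RandomPivotQuickSort {
    private static final Random RANDOM = new Random();

    public static void quickSort(int[] array) {
        sort(array, 0, array.length - 1);
    }

    private static void sort(int[] array, int min, int max) {
        if (min >= max) { // Nothing left to divide
            return;
        }
        // A random pivot makes the O(N^2) worst case vanishingly unlikely
        int pivot = array[min + RANDOM.nextInt(max - min + 1)];
        int i = min, j = max;
        while (i <= j) { // Partition: smaller values to the left, larger to the right
            while (array[i] < pivot) i++;
            while (array[j] > pivot) j--;
            if (i <= j) {
                int temp = array[i];
                array[i] = array[j];
                array[j] = temp;
                i++;
                j--;
            }
        }
        sort(array, min, j); // Recurse on the left part
        sort(array, i, max); // Recurse on the right part
    }

    public static void main(String[] args) {
        int[] testArr = {6, 3, 8, 8, 6, 9, 4, 11, 1};
        quickSort(testArr);
        System.out.println(Arrays.toString(testArr));
    }
}
```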

Merge sort

This sort is also popular. It is one of many algorithms that relies on the principle of "divide and conquer". Such algorithms first divide the problem into manageable parts (quicksort is another example of such an algorithm). So what's the gist of this algorithm?

Divide:

The array is split into two parts of approximately the same size. Each of these two parts is split in two again, and so on, until the smallest possible indivisible parts remain: single-element arrays, each of which is, by definition, already sorted.

Conquer:

This is where we begin the process that gave the algorithm its name: merge. To do this, we take the two resulting sorted arrays and merge them into one. In this case, the smallest of the first elements of the two arrays is written into the resulting array. This operation is repeated until all the elements in these two arrays have been copied over. That is, if we have two minimal arrays {6} and {4}, we compare their values and generate this merged result: {4,6}. If we have sorted arrays {4,6} and {2,8}, we first compare the values 4 and 2, and then we write 2 into the resulting array. After that, 4 and 8 will be compared, and we'll write 4. Finally, 6 and 8 will be compared. Accordingly, we'll write 6, and only after that will we write 8. As a result, we get the following merged array: {2,4,6,8}. How will this look in Java code? To run this algorithm, it will be convenient for us to use recursion:

import java.util.Arrays;

public class Solution {
   public static void main(String[] args) {
       int[] testArr = new int[]{6, 3, 8, 8, 6, 9, 4, 11, 1};
       testArr = mergeSort(testArr);
       for (int i : testArr) {
           System.out.println(i);
       }
   }

   public static int[] mergeSort(int[] array1) {
       int[] sortArr = Arrays.copyOf(array1, array1.length); // Array for sorting
       int[] bufferArr = new int[array1.length];// Buffer array
       return recursionMergeSort(sortArr, bufferArr, 0, array1.length);
   }


   public static int[] recursionMergeSort(int[] sortArr, int[] bufferArr,
                                          int startIndex, int endIndex) {
       if (startIndex >= endIndex - 1) { // Return the array when there is only one element left in the array range under consideration
           return sortArr;
       }

       // Make a recursive call to get two sorted arrays:
       int middle = startIndex + (endIndex - startIndex) / 2;
       int[] firstSortArr = recursionMergeSort(sortArr, bufferArr, startIndex, middle);
       int[] secondSortArr = recursionMergeSort(sortArr, bufferArr, middle, endIndex);

       // Merge the sorted arrays:
       int firstIndex = startIndex;
       int secondIndex = middle;
       int destIndex = startIndex;
       int[] result = firstSortArr == sortArr ? bufferArr : sortArr;
       while (firstIndex < middle && secondIndex < endIndex) {
           result[destIndex++] = firstSortArr[firstIndex] < secondSortArr[secondIndex]
                   ? firstSortArr[firstIndex++] : secondSortArr[secondIndex++];
       }
       while (firstIndex < middle) {
           result[destIndex++] = firstSortArr[firstIndex++];
       }
       while (secondIndex < endIndex) {
           result[destIndex++] = secondSortArr[secondIndex++];
       }
       return result;
   }
}
As in quicksort, we wrap the recursive method in an intermediate method so that the caller only needs to supply the array to be sorted and doesn't have to worry about any additional default arguments. This algorithm has similarities with quicksort and, unsurprisingly, its time complexity is the same: O(N*logN).

2. Greedy algorithms

A greedy algorithm is an approach where locally optimal decisions are made at each stage, with the assumption that the final solution will also be optimal. The "optimal" solution is the one that offers the most obvious and immediate benefit at any particular step/stage. To explore this algorithm, let's take a fairly common problem — the knapsack problem. Pretend for a moment that you are a thief. You've broken into a store at night with a knapsack (backpack). In front of you are several goods that you could steal. But your knapsack has limited capacity: it can carry no more than 30 units of weight. You want to carry away the most valuable set of goods that will fit into the backpack. How do you determine what to put into your bag? So, the greedy algorithm for the knapsack problem consists of the following steps (we assume that all the items can individually fit in the knapsack):
  1. Choose the most expensive item that hasn't been taken yet.
  2. If it fits in the knapsack, put it in. If not, then leave it.
  3. Have we already stolen everything? If not, we return to step 1. If yes, then we make our quick getaway from the store, since we have accomplished what we came to do.
Let's look at this, but in Java. This is how the Item class will look:

public class Item implements Comparable<Item> {
   private String name;
   private int weight;
   private int value;

   public Item(String name, int weight, int value) {
       this.name = name;
       this.weight = weight;
       this.value = value;
   }

   public String getName() {
       return name;
   }

   public int getWeight() {
       return weight;
   }

   public int getValue() {
       return value;
   }

   @Override
   public int compareTo(Item o) {
       return Integer.compare(o.value, this.value); // Descending order of value
   }
}
There is nothing special here: three fields (name, weight, value) that define the characteristics of the item. Also, as you can see, the Comparable interface is implemented to allow us to sort our Items by value. Next, we'll look at the Bag class, which represents our knapsack:

import java.util.ArrayList;
import java.util.List;

public class Bag {
   private final int maxWeight;
   private List<Item> items;
   private int currentWeight;
   private int currentValue;

   public Bag(int maxWeight) {
       this.maxWeight = maxWeight;
       items = new ArrayList<>();
       currentValue = 0;
   }

   public int getMaxWeight() {
       return maxWeight;
   }

   public int getCurrentValue() {
       return currentValue;
   }

   public int getCurrentWeight() {
       return currentWeight;
   }

   public void addItem(Item item) {
       items.add(item);
       currentWeight += item.getWeight();
       currentValue += item.getValue();
   }
}
  • maxWeight is our backpack's capacity, which is set when we create an object;
  • items represents the objects in our backpack;
  • currentWeight, currentValue — these fields store the current weight and value of all the items in the backpack, which we increase when we add a new item in the addItem method.
Anyway, let's now go to the class where all the action takes place:

import java.util.ArrayList;
import java.util.Collections;
import java.util.List;

public class Solution {

   public static void main(String[] args) {
       List<Item> items = new ArrayList<>();
       items.add(new Item("Guitar", 7, 800));
       items.add(new Item("Iron", 6, 500));
       items.add(new Item("Tea pot", 3, 300));
       items.add(new Item("Lamp", 4, 500));
       items.add(new Item("Television", 15, 2000));
       items.add(new Item("Vase", 2, 450));
       items.add(new Item("Mixer", 2, 400));
       items.add(new Item("Blender", 3, 200));

       Collections.sort(items);

       Bag firstBag = new Bag(30);

       fillBackpack(firstBag, items);

       System.out.println("Knapsack weight is " + firstBag.getCurrentWeight() +
               ". The total value of items in the knapsack is " + firstBag.getCurrentValue());
   }
}
First, we create a list of items and sort it. We create a bag object with a 30-unit capacity. Next, we pass the items and the bag object to the fillBackpack method, which fills the knapsack according to our greedy algorithm:

public static void fillBackpack(Bag bag, List<Item> items) {
   for (Item item : items) {
       if (bag.getMaxWeight() >= bag.getCurrentWeight() + item.getWeight()) { // >= so that an item filling the bag exactly still fits
            bag.addItem(item);
       }
   }
}
It's pretty simple: we go through the list of items sorted by value and put each one into the bag if its capacity allows. If there isn't enough room, the item is skipped and we continue traversing the rest of the items until we reach the end of the list. Once we run main, here's the console output we'll get:
Knapsack weight is 30. The total value of items in the knapsack is 3750
This is an example of a greedy algorithm: at each step, a locally optimal solution is selected, and in the end, you get a globally optimal solution. In our case, the best option is the most expensive item. But is this the best solution? Don't you think it's possible to improve our solution slightly in order to fill our backpack with items that have even greater total value? Let's take a look at how this can be done.

public static void effectiveFillBackpack(Bag bag, List<Item> items) {
   Map<Double, Item> sortByRatio = new TreeMap<>(Collections.reverseOrder());
   for (Item item : items) {
       sortByRatio.put((double)item.getValue() / item.getWeight(), item);
   }

   for (Map.Entry<Double, Item> entry : sortByRatio.entrySet()) {
       if (bag.getMaxWeight() >= bag.getCurrentWeight() + entry.getValue().getWeight()) { // >= so that an item filling the bag exactly still fits
           bag.addItem(entry.getValue());
       }
   }
}
Here we first calculate the value-to-weight ratio for each item. This tells us the value of each unit of weight of a given item. And then we use these ratios to sort our items and add them to our bag. (Note that keying a TreeMap by ratio means two items with exactly the same ratio would clash; it works here because all the ratios happen to be distinct.) Let's run the following:

Bag secondBag = new Bag(30);

effectiveFillBackpack(secondBag, items);

System.out.println("The weight of the knapsack is " + secondBag.getCurrentWeight() +
       ". The total cost of items in the knapsack is " + secondBag.getCurrentValue());
We get this console output:
The weight of the knapsack is 30. The total cost of items in the knapsack is 4150
A little bit better, isn't it? The greedy algorithm makes a locally optimal choice at every step, hoping that the final solution will also be optimal. This assumption is not always valid, but for many tasks, greedy algorithms do yield an optimal final solution. The greedy fill itself runs in O(N); sorting the items first adds O(N*logN). Pretty good, huh?
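To see why greedy-by-ratio is only a heuristic for this kind of knapsack problem, here's a tiny made-up instance (my numbers, not the store example above) where it loses to an exhaustive search. The greedy pass grabs the single best-ratio item and then nothing else fits, while the true optimum skips it:

```java
import java.util.ArrayList;
import java.util.Arrays;
import java.util.Comparator;
import java.util.List;

public class GreedyVsBruteForce {
    static class Thing {
        final String name;
        final int weight;
        final int value;
        Thing(String name, int weight, int value) {
            this.name = name;
            this.weight = weight;
            this.value = value;
        }
    }

    // Greedy: take items in descending value/weight order while they fit.
    static int greedy(List<Thing> things, int capacity) {
        List<Thing> sorted = new ArrayList<>(things);
        sorted.sort(Comparator.comparingDouble((Thing t) -> (double) t.value / t.weight).reversed());
        int weight = 0, value = 0;
        for (Thing t : sorted) {
            if (weight + t.weight <= capacity) {
                weight += t.weight;
                value += t.value;
            }
        }
        return value;
    }

    // Brute force: check all 2^N subsets (fine for tiny N).
    static int bruteForce(List<Thing> things, int capacity) {
        int best = 0;
        for (int mask = 0; mask < (1 << things.size()); mask++) {
            int weight = 0, value = 0;
            for (int i = 0; i < things.size(); i++) {
                if ((mask & (1 << i)) != 0) {
                    weight += things.get(i).weight;
                    value += things.get(i).value;
                }
            }
            if (weight <= capacity && value > best) {
                best = value;
            }
        }
        return best;
    }

    public static void main(String[] args) {
        List<Thing> things = Arrays.asList(
                new Thing("A", 6, 60),  // ratio 10.0: greedy grabs this first
                new Thing("B", 5, 45),  // ratio 9.0
                new Thing("C", 5, 45)); // ratio 9.0
        System.out.println("Greedy value: " + greedy(things, 10));      // prints 60
        System.out.println("Optimal value: " + bruteForce(things, 10)); // prints 90
    }
}
```

With capacity 10, greedy takes A (weight 6) and then neither B nor C fits, for a total of 60; the optimal answer is B + C = 90. The general fix for the 0/1 knapsack is dynamic programming, but that's a story for another article.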