Remove Nth Node From End of List

Problem

Given the head of a linked list, remove the nth node from the end of the list and return its head.

Example 1:

Input: head = [1,2,3,4,5], n = 2
Output: [1,2,3,5]

Example 2:

Input: head = [1], n = 1
Output: []

Example 3:

Input: head = [1,2], n = 1
Output: [1]

Constraints:

  • The number of nodes in the list is sz.
  • 1 <= sz <= 30
  • 0 <= Node.val <= 100
  • 1 <= n <= sz

Follow up: Could you do this in one pass?

Solution

This problem was a bit difficult for me. The approach below is a single pass with two pointers: fast walks the whole list while slow only starts moving once fast has gone more than n nodes in, so when fast falls off the end, slow is sitting just before the nth node from the end.

/**
 * Definition for singly-linked list.
 * public class ListNode {
 *     int val;
 *     ListNode next;
 *     ListNode() {}
 *     ListNode(int val) { this.val = val; }
 *     ListNode(int val, ListNode next) { this.val = val; this.next = next; }
 * }
 */
class Solution {
    public ListNode removeNthFromEnd(ListNode head, int n) {
        var fast = head;
        ListNode slow = null;
        var i = 0;

        // Single pass: advance fast through the whole list, counting nodes.
        // Once more than n nodes have been seen, start advancing slow, so
        // slow trails fast by n + 1 nodes. When fast runs off the end,
        // slow is the node just before the one to remove.
        while (fast != null) {
            i += 1;
            fast = fast.next;
            if (i > n) {
                if (slow == null) {
                    slow = head;
                } else {
                    slow = slow.next;
                }
            }
        }

        // slow never started moving: the list has exactly n nodes, so the
        // head itself is the node to remove.
        if (slow == null) {
            return head.next;
        }

        // Remove the target node (slow.next): if it is the last node, drop
        // it; otherwise copy the next node's value into it and unlink that
        // next node instead.
        if (slow.next.next == null) {
            slow.next = null;
        } else {
            slow.next.val = slow.next.next.val;
            slow.next.next = slow.next.next.next;
        }

        return head;
    }
}
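
For a quick local check, here is a minimal sketch that builds the list from Example 1 and runs the solution, assuming the ListNode class from the comment above is available. The buildList and render helpers are hypothetical, just for this sketch; they are not part of the LeetCode scaffolding.

// Minimal local test sketch, assuming the ListNode definition above.
// buildList and render are hypothetical helpers for illustration only.
class Scratch {
    // Build a singly-linked list from the given values.
    static ListNode buildList(int... vals) {
        ListNode dummy = new ListNode();
        ListNode cur = dummy;
        for (int v : vals) {
            cur.next = new ListNode(v);
            cur = cur.next;
        }
        return dummy.next;
    }

    // Render a list in the same [a,b,c] style as the problem examples.
    static String render(ListNode node) {
        StringBuilder sb = new StringBuilder("[");
        while (node != null) {
            sb.append(node.val);
            if (node.next != null) sb.append(",");
            node = node.next;
        }
        return sb.append("]").toString();
    }

    public static void main(String[] args) {
        // Example 1: [1,2,3,4,5], n = 2 -> expected [1,2,3,5]
        ListNode head = buildList(1, 2, 3, 4, 5);
        System.out.println(render(new Solution().removeNthFromEnd(head, 2)));
    }
}

One design note: the final branch copies the following node's value forward rather than simply setting slow.next = slow.next.next. Either works here, since the problem only cares about the resulting sequence of values.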
