MacBook is $599 now!

ALSO: Cache Eviction Policies

In partnership with

Welcome back!

This week’s coding challenge is easy but very popular with FAANG companies.

Today we will cover:

  • Majority Element

  • Cache Eviction Policies

Read time: under 4 minutes

CODING CHALLENGE

Majority Element

Given an array nums of size n, return the majority element.

The majority element is the element that appears more than ⌊n / 2⌋ times. You may assume that the majority element always exists in the array. 

Example 1:

Input: nums = [3,2,3]
Output: 3

Example 2:

Input: nums = [2,2,1,1,1,2,2]
Output: 2

Solve the problem here before reading the solution.

PRESENTED BY MINTLIFY

Ship Docs Your Team Is Actually Proud Of

Mintlify helps you create fast, beautiful docs that developers actually enjoy using. Write in markdown, sync with your repo, and deploy in minutes. Built-in components handle search, navigation, API references, and interactive examples out of the box, so you can focus on clear content instead of custom infrastructure.

Automatic versioning, analytics, and AI-powered search make it easy to scale as your product grows, and AI-powered workflows keep your docs accurate with every pull request.

Whether you're a developer, a technical writer, or part of devrel, Mintlify fits into the way you already work and helps your documentation keep pace with your product.

SOLUTION

To solve this problem, we can use the Boyer-Moore Voting Algorithm, a clever technique that exploits the fact that the majority element appears more than ⌊n / 2⌋ times.

The algorithm works by keeping a count and a candidate. When we find the same element as the candidate, we increase the count. When we find a different element, we decrease the count.

If the count becomes 0, we pick a new candidate and reset the count to 1. The key insight is that the majority element outnumbers all other elements combined, so no matter how the votes cancel out, it will always survive as the final candidate.
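The voting loop described above can be sketched in a few lines of Python (the function name is illustrative):

```python
def majority_element(nums):
    """Boyer-Moore voting: find the element appearing more than ⌊n/2⌋ times."""
    count = 0
    candidate = None
    for num in nums:
        if count == 0:
            candidate = num  # pick a new candidate
            count = 1
        elif num == candidate:
            count += 1       # same as candidate: vote for it
        else:
            count -= 1       # different element: vote against it
    # The problem guarantees a majority element exists, so no second
    # verification pass is needed.
    return candidate
```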

The time complexity of this solution is O(n), where n is the length of the array. The space complexity is O(1) since we only use two variables.

HEARD OF MINDSTREAM?

Turn AI Into Extra Income

You don’t need to be a coder to make AI work for you. Subscribe to Mindstream and get 200+ proven ideas showing how real people are using ChatGPT, Midjourney, and other tools to earn on the side.

From small wins to full-on ventures, this guide helps you turn AI skills into real results, without the overwhelm.

SYSTEM DESIGN

Cache Eviction Policies

Cache is a high-speed storage layer that sits between an application and its primary data store. It holds frequently accessed data to speed up reads. When a program needs to fetch data, it first checks the cache before going to the slower main storage.
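This check-the-cache-first pattern (often called cache-aside) can be sketched as follows; the `cache` and `db` arguments are just illustrative stand-ins for a real cache layer and primary store:

```python
def read(key, cache, db):
    """Cache-aside read: check the fast cache first, fall back to the store."""
    value = cache.get(key)
    if value is None:
        value = db[key]     # slow path: fetch from the primary data store
        cache[key] = value  # populate the cache for subsequent reads
    return value
```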

Cache storage typically uses RAM, which is costlier than the main storage on disk. Due to this cost constraint, cache size is usually limited. When this limited cache becomes full and we need to add new items, we must decide which existing items to remove. This is where cache eviction policies come into play.

The simplest approach to cache eviction is First-In-First-Out (FIFO). As the name suggests, FIFO removes the item that was added to the cache first. Think of it as maintaining a queue of cached items – when we need space, we remove the item at the front of the queue (the oldest one) and add the new item at the back. While FIFO is straightforward to implement, it has a significant drawback. Consider a scenario where a particular piece of data is accessed very frequently but happened to be cached first. Under FIFO, this frequently used item would be the first to be removed despite being the most useful item in the cache.
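A minimal FIFO sketch, using a queue to track insertion order (the class name and plain-dict storage are illustrative, not a production design). Note how reads never touch the queue, which is exactly the drawback described above:

```python
from collections import deque

class FIFOCache:
    """Evicts the oldest inserted key, regardless of how often it is read."""

    def __init__(self, capacity):
        self.capacity = capacity
        self.store = {}
        self.order = deque()  # insertion order, oldest key at the left

    def get(self, key):
        return self.store.get(key)  # reads do not affect eviction order

    def put(self, key, value):
        if key not in self.store and len(self.store) >= self.capacity:
            oldest = self.order.popleft()  # remove from the front of the queue
            del self.store[oldest]
        if key not in self.store:
            self.order.append(key)         # new key joins the back of the queue
        self.store[key] = value
```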

This limitation leads us to consider frequency of access as a metric for eviction, bringing us to the Least Frequently Used (LFU) policy. LFU keeps track of how many times each cached item has been accessed and removes the item with the lowest access count when space is needed. This works well for many scenarios, but LFU too has its shortcomings. Consider a situation where certain data was accessed many times in the past but hasn't been used recently. With LFU, this item would stay in cache due to its high historical access count, even though it's no longer needed.
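A simple LFU sketch under the same assumptions (illustrative names; the O(n) scan for the minimum count keeps the code short, whereas real LFU caches use frequency buckets or a heap):

```python
class LFUCache:
    """Evicts the key with the smallest access count."""

    def __init__(self, capacity):
        self.capacity = capacity
        self.store = {}
        self.counts = {}  # key -> number of times get() has hit it

    def get(self, key):
        if key in self.store:
            self.counts[key] += 1
            return self.store[key]
        return None

    def put(self, key, value):
        if key not in self.store and len(self.store) >= self.capacity:
            victim = min(self.counts, key=self.counts.get)  # lowest count
            del self.store[victim]
            del self.counts[victim]
        self.store[key] = value
        self.counts.setdefault(key, 0)  # overwriting a key keeps its count
```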

This brings us to the Least Recently Used (LRU) policy, which focuses on recency rather than frequency. LRU tracks when each item was last accessed and removes the item that hasn't been accessed for the longest time. This policy works particularly well for workloads where items accessed recently are likely to be accessed again soon. LRU handles both the FIFO problem of evicting frequently used items and the LFU problem of holding onto items that were popular in the past but are no longer useful.
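An LRU sketch is especially compact in Python because `collections.OrderedDict` already tracks ordering; moving a key to the end on every access keeps the least recently used key at the front (again, an illustrative sketch rather than a production cache):

```python
from collections import OrderedDict

class LRUCache:
    """Evicts the least recently used key; both reads and writes refresh recency."""

    def __init__(self, capacity):
        self.capacity = capacity
        self.store = OrderedDict()  # front = least recent, back = most recent

    def get(self, key):
        if key not in self.store:
            return None
        self.store.move_to_end(key)  # mark as most recently used
        return self.store[key]

    def put(self, key, value):
        if key in self.store:
            self.store.move_to_end(key)
        elif len(self.store) >= self.capacity:
            self.store.popitem(last=False)  # drop the least recently used key
        self.store[key] = value
```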

FEATURED COURSES

5 Courses of the Week

Stanford’s Algorithms Specialization: Master fundamental algorithms through programming assignments and conceptual learning. This specialization prepares you to excel in technical interviews and communicate fluently about algorithms without focusing on low-level implementation details.

Grokking the Modern System Design Interview: Get hands-on practice with over 100 data structures and algorithm exercises and guidance from a dedicated mentor to help prepare you for interviews and on-the-job scenarios.

DeepLearning.AI Data Engineering Certificate: Build data pipelines on AWS, design data architectures, and learn batch/streaming processing. Hands-on labs with real-world tools.

IBM’s Intro to Containers w/ Docker, Kubernetes & OpenShift: Learn to build and run applications using containers with Docker, and manage them at scale with Kubernetes and OpenShift. Get hands-on experience through browser-based labs.

Firebase in a Weekend (Android): This course will teach you when and why to choose Firebase as a backend for your Android application.

NEWS

This Week in the Tech World

Apple Launches $599 MacBook Neo: Apple released its cheapest laptop ever, powered by an iPhone A18 Pro chip instead of the M-series. It's the first Mac to use an A-series processor, targeting students and budget buyers.

OpenAI Acquires Promptfoo: OpenAI is acquiring Promptfoo, an AI security platform used by over 25% of Fortune 500 companies. The open-source tool will be integrated into OpenAI Frontier, the company's enterprise platform for building AI agents.

LeCun's AMI Labs Raises $1B: Yann LeCun's new startup AMI Labs raised $1.03 billion in what is reportedly Europe's largest-ever seed round. The company is building "world models," an alternative AI approach to large language models.

Nvidia Invests in Thinking Machines: Nvidia made a significant investment in Mira Murati's Thinking Machines Lab and signed a multi-year compute deal. The startup will deploy at least one gigawatt of Nvidia's Vera Rubin systems starting in 2027.

Oracle Plans Thousands of Layoffs: Oracle is planning to cut thousands of jobs across multiple divisions to free up cash flow. The layoffs are driven by the financial strain of the company's massive AI data center expansion.

Trump Admin Tightens AI Vendor Rules: The Trump administration has drafted stricter rules requiring AI vendors seeking government contracts to permit "any lawful" use of their models. The move could reshape who wins federal AI work.

Samsung Pursues AI Partnerships: Samsung is partnering with OpenAI and Perplexity to bring more AI features to Galaxy devices. The company is betting on a multi-model approach as the smartphone race shifts from hardware specs to AI experience.

China Centers AI in National Strategy: China's latest five-year plan places AI at the core of its economic roadmap across manufacturing, healthcare, and education. Beijing is framing AI as a system-wide national capability, not a single sector.

BONUS

Just for laughs 😏

HELP US

👋 Hi there! We are on a mission to provide as much value as possible for free. If you want this newsletter to remain free, please help us grow by referring your friends:

📌 Share your referral link on LinkedIn or directly with your friends.
📌 Check your referral status here.

YOUR FEEDBACK

What did you think of this week's email?

Your feedback helps us create better emails for you!

Until next time, take care! 🚀

Cheers,