## 1. Data Types & Mutability (Beginner)
| Type | Mutable? | Example |
|---|---|---|
| int, float, complex | No | 42, 3.14, 2+3j |
| bool | No | True, False |
| str | No | "hello" |
| tuple | No | (1, 2, 3) |
| frozenset | No | frozenset({1,2}) |
| list | Yes | [1, 2, 3] |
| dict | Yes | {"a": 1} |
| set | Yes | {1, 2, 3} |
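A minimal sketch of the table above in action, contrasting rebinding an immutable object with in-place mutation:

```python
# Immutable: "changing" an int rebinds the name to a new object
x = 10
before = id(x)
x += 1
assert id(x) != before        # different object now

# Mutable: a list is modified in place, identity unchanged
lst = [1, 2]
before = id(lst)
lst.append(3)
assert id(lst) == before      # same object
assert lst == [1, 2, 3]

# is vs ==: equal values, distinct objects
a, b = [1, 2], [1, 2]
assert a == b and a is not b
```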
Use type() to check an object's type, id() to get its identity, and remember: is compares identity, == compares equality.

## 2. Strings (Beginner)

Strings are immutable sequences of Unicode characters. Repeated concatenation with + builds a new string each time; use "".join() instead.
# Key methods
s = "Hello, World"
s.lower() # "hello, world"
s.split(",") # ["Hello", " World"]
s.strip() # remove whitespace
s.replace("H","J") # "Jello, World"
s.find("Wo") # 7 (-1 if not found)
f"Name: {name}" # f-string (3.6+)
"-".join(lst) # join list into string

## 3. Lists vs Tuples (Beginner)
| Feature | List | Tuple |
|---|---|---|
| Syntax | [1, 2, 3] | (1, 2, 3) |
| Mutable | Yes | No |
| Hashable | No | Yes (if elements are) |
| Use as dict key | No | Yes |
| Performance | Slower | Faster (less memory) |
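The hashability rows above are easy to demonstrate: tuples can key a dict, lists cannot (coordinates here are just illustrative data):

```python
# Tuples are hashable (if their elements are), so they can key a dict
locations = {(40.7, -74.0): "NYC", (51.5, -0.1): "London"}
assert locations[(40.7, -74.0)] == "NYC"

# Lists are unhashable: using one as a key raises TypeError
raised = False
try:
    {[1, 2]: "nope"}
except TypeError:
    raised = True
assert raised

# A tuple containing a mutable element is also unhashable
raised = False
try:
    hash(([1], 2))
except TypeError:
    raised = True
assert raised
```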
# Common list operations
a = [3, 1, 2]
a.append(4) # [3, 1, 2, 4]
a.sort() # [1, 2, 3, 4] in-place
sorted(a, reverse=True) # returns new list
a[1:3] # slicing: [2, 3]
a[::-1] # reverse: [4, 3, 2, 1]

## 4. Dictionaries & Sets (Beginner)

Know collections.Counter and defaultdict — they're handy shortcuts.
# Dict — ordered (3.7+), key-value pairs
d = {"a": 1, "b": 2}
d.get("c", 0) # 0 (default if missing)
d.keys() # dict_keys(["a","b"])
d.values() # dict_values([1, 2])
d.items() # dict_items([("a",1),...])
d | {"c": 3} # merge (3.9+)
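Counter and defaultdict (the shortcuts mentioned above) in practice, with illustrative data:

```python
from collections import Counter, defaultdict

# Counter: frequency counting in one line
counts = Counter("mississippi")
assert counts["s"] == 4
assert counts.most_common(1) == [("i", 4)]

# defaultdict: missing keys get an auto-created default value
groups = defaultdict(list)
for word in ["apple", "avocado", "banana"]:
    groups[word[0]].append(word)   # no KeyError on first access
assert groups["a"] == ["apple", "avocado"]
assert groups["b"] == ["banana"]
```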
# Set — unordered, unique elements
s = {1, 2, 3}
s & {2, 3, 4} # intersection: {2, 3}
s | {4} # union: {1, 2, 3, 4}
s - {1} # difference: {2, 3}
defaultdict, Counter, and OrderedDict from collections are commonly tested favorites.

## 5. Control Flow (Beginner)

Knowing for/else, the walrus operator, and match/case signals that you keep up with modern Python. The ternary expression and chained comparisons show concise thinking. You may be asked to simplify nested if/else logic using these features.
# Ternary
x = "even" if n % 2 == 0 else "odd"
# Walrus operator (3.8+)
if (n := len(data)) > 10:
    print(n)
# for/else — else runs if no break
for i in range(10):
    if i == 5: break
else:
    print("no break")
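A common use of for/else is a search with a "not found" branch (hypothetical helper for illustration):

```python
def find_first_negative(nums):
    for n in nums:
        if n < 0:
            result = n
            break
    else:                 # runs only if the loop completed without break
        result = None
    return result

assert find_first_negative([3, -1, 2]) == -1
assert find_first_negative([3, 1, 2]) is None
```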
# match/case (3.10+)
match status:
    case 200: print("OK")
    case 404: print("Not Found")
    case _: print("Other")

## 6. Functions & Arguments (Beginner)

Understanding the * and / separators, type hints, and first-class functions shows depth. You may be asked to write a function that accepts flexible arguments or explain how Python resolves argument order.
# Positional, keyword, default
def greet(name, greeting="Hello"):
    return f"{greeting}, {name}"
# Keyword-only (after *)
def func(a, b, *, key=None): ...
# Positional-only (before /, 3.8+)
def func(a, b, /, c=0): ...
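Combining / and * in one signature shows how both separators interact (an illustrative sketch, not from the original):

```python
def func(a, b, /, c=0, *, key=None):
    # a, b: positional-only; c: either; key: keyword-only
    return (a, b, c, key)

assert func(1, 2, 3, key="k") == (1, 2, 3, "k")
assert func(1, 2, c=3) == (1, 2, 3, None)

raised = False
try:
    func(a=1, b=2)        # a and b cannot be passed by keyword
except TypeError:
    raised = True
assert raised

raised = False
try:
    func(1, 2, 3, "k")    # key cannot be passed positionally
except TypeError:
    raised = True
assert raised
```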
# Type hints
def add(a: int, b: int) -> int:
    return a + b
Gotcha: def f(x=[]) shares the same list across calls.

## 7. Comprehensions (Beginner)
# List comprehension
squares = [x**2 for x in range(10) if x % 2 == 0]
# Dict comprehension
d = {k: v for k, v in zip(keys, vals)}
# Set comprehension
unique = {x.lower() for x in words}
# Generator expression (lazy)
total = sum(x**2 for x in range(10))

## 8. OOP Fundamentals (Intermediate)

Know the difference between @classmethod, @staticmethod, and instance methods, and when to use @property. A common exercise: "Design a class hierarchy for [X]." Know the difference between class variables (shared) and instance variables (per-object).
class Animal:
    species_count = 0                 # class variable
    def __init__(self, name):
        self.name = name              # instance variable
        Animal.species_count += 1
    def speak(self):                  # instance method
        raise NotImplementedError
    @classmethod
    def count(cls):                   # access class state
        return cls.species_count
    @staticmethod
    def is_animal(obj):               # no self/cls
        return isinstance(obj, Animal)
    @property
    def info(self):                   # getter
        return self.name
Four Pillars: Encapsulation, Abstraction, Inheritance, Polymorphism.
## 9. Inheritance & MRO (Intermediate)

super() follows the MRO, not just the direct parent. Also know: composition vs inheritance — a common follow-up is "when would you prefer composition?"
class Dog(Animal):
    def speak(self):
        return "Woof!"
# Multiple inheritance
class C(A, B): pass
# MRO — C3 Linearization
C.__mro__  # or C.mro()
# (C, A, B, object)
# super() follows MRO, not just parent
class A:
    def method(self):
        super().method()  # calls next in MRO
Use super() consistently to avoid issues.

## 10. Magic / Dunder Methods (Intermediate)

Common prompts: "Make a class work with len() and for loops", "Implement __eq__ and __hash__ correctly", or "What's the difference between __repr__ and __str__?" Knowing __call__ is a bonus — it connects to decorator and callable patterns.
| Method | Purpose |
|---|---|
| __init__ | Constructor |
| __repr__ | Developer string repr |
| __str__ | User-friendly string |
| __len__ | len(obj) |
| __getitem__ | obj[key] |
| __iter__, __next__ | Make iterable |
| __eq__, __lt__, ... | Comparison operators |
| __add__, __mul__, ... | Arithmetic operators |
| __enter__, __exit__ | Context manager |
| __call__ | obj() — make callable |
| __hash__ | hash(obj) for sets/dicts |
## 11. Decorators (Intermediate)

Common uses: logging, timing, caching (@lru_cache), rate limiting. Always mention @functools.wraps — forgetting it is a common red flag.
import functools, time
# Basic decorator
def timer(func):
    @functools.wraps(func)
    def wrapper(*args, **kwargs):
        start = time.time()
        result = func(*args, **kwargs)
        print(f"Took {time.time()-start:.3f}s")
        return result
    return wrapper
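For the caching use case mentioned above, no hand-rolled decorator is needed; functools.lru_cache is the stdlib version:

```python
from functools import lru_cache

@lru_cache(maxsize=None)
def fib(n):
    # exponential without caching, linear with it
    return n if n < 2 else fib(n - 1) + fib(n - 2)

assert fib(30) == 832040
assert fib.cache_info().hits > 0   # repeated subproblems came from the cache
```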
# Decorator with arguments
def repeat(n):
    def decorator(func):
        @functools.wraps(func)
        def wrapper(*args, **kwargs):
            for _ in range(n):
                func(*args, **kwargs)
        return wrapper
    return decorator
@repeat(3)
def hello(): print("Hi")
Use @functools.wraps to preserve the wrapped function's metadata.

## 12. Generators & Iterators (Intermediate)

Expect "What does yield from do?" Know that generators can only be iterated once.
# Generator function (uses yield)
def fibonacci():
    a, b = 0, 1
    while True:
        yield a
        a, b = b, a + b
gen = fibonacci()
next(gen)  # 0
next(gen)  # 1
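The "iterated once" point is worth seeing concretely:

```python
# Generators are single-pass: once exhausted, they stay empty
gen = (x * x for x in range(3))
assert list(gen) == [0, 1, 4]
assert list(gen) == []          # already consumed

# next() on an exhausted generator raises StopIteration
raised = False
try:
    next(gen)
except StopIteration:
    raised = True
assert raised
```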
# yield from — delegate to sub-generator
def chain(*iterables):
    for it in iterables:
        yield from it
# Iterator protocol
class Counter:
    def __init__(self, n): self.n = n; self.i = 0
    def __iter__(self): return self
    def __next__(self):
        if self.i >= self.n: raise StopIteration
        self.i += 1; return self.i

## 13. Lambda, Map, Filter, Reduce (Intermediate)

Rewriting a map/filter chain as a comprehension or explaining reduce is a common follow-up.
# Lambda — anonymous single-expression function
square = lambda x: x ** 2
# map — apply function to each element
list(map(lambda x: x*2, [1,2,3])) # [2, 4, 6]
# filter — keep elements where fn is True
list(filter(lambda x: x>0, [-1,2,-3,4])) # [2, 4]
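Interviewers often ask you to rewrite map/filter as comprehensions; the results are equivalent:

```python
nums = [-1, 2, -3, 4]

# map/filter and the equivalent comprehensions
assert list(map(lambda x: x * 2, nums)) == [x * 2 for x in nums]
assert list(filter(lambda x: x > 0, nums)) == [x for x in nums if x > 0]

# Combined in one comprehension: double only the positives
assert [x * 2 for x in nums if x > 0] == [4, 8]
```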
# reduce — accumulate
from functools import reduce
reduce(lambda a, b: a+b, [1,2,3,4])  # 10

## 14. Exception Handling (Intermediate)

Expect: "What's the difference between except Exception and bare except?", "When does else run?", "Why should you avoid catching broad exceptions?" Know how to create custom exceptions and the EAFP ("Easier to Ask Forgiveness than Permission") vs LBYL ("Look Before You Leap") pattern.
try:
    result = 10 / 0
except ZeroDivisionError as e:
    print(e)
except (TypeError, ValueError):
    print("Type or Value error")
else:
    print("No error")     # runs if no exception
finally:
    print("Always runs")  # cleanup
# Custom exception
class MyError(Exception):
    def __init__(self, msg, code):
        super().__init__(msg)
        self.code = code

## 15. Scope & Closures (Intermediate)

Expect "What does nonlocal do vs global?" Also beware the late-binding closure gotcha in loops — it's a favorite trick question where all lambdas return the same value.
# LEGB Rule: Local → Enclosing → Global → Built-in
x = "global"
def outer():
    x = "enclosing"
    def inner():
        nonlocal x    # modify enclosing
        x = "inner"
    inner()
    print(x)  # "inner"
# Closure — inner fn remembers enclosing scope
def make_multiplier(n):
    def multiply(x):
        return x * n  # n is "closed over"
    return multiply
triple = make_multiplier(3)
triple(5)  # 15

## 16. Shallow vs Deep Copy (Intermediate)

Know that b = a copies only the reference, .copy() / slicing creates a shallow copy (new outer, shared inner), and copy.deepcopy() creates a fully independent clone. This directly impacts debugging and data integrity in production code.
import copy
a = [[1, 2], [3, 4]]
b = a # reference (same object)
c = a.copy() # shallow: new list, same inner lists
d = copy.deepcopy(a) # deep: fully independent copy
a[0].append(99)
print(b[0]) # [1, 2, 99] — same object
print(c[0]) # [1, 2, 99] — shallow: inner shared
print(d[0])  # [1, 2] — deep: independent

## 17. *args & **kwargs (Intermediate)

Expect: "What's the difference between *args and **kwargs?", "In what order must they appear in a function signature?", and "How do you use * and ** for unpacking?" The answer: positional first, then *args, then keyword-only, then **kwargs.
def func(*args, **kwargs):
    # args = tuple of positional args
    # kwargs = dict of keyword args
    print(args, kwargs)
func(1, 2, x=3) # (1, 2) {'x': 3}
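The full ordering rule in a single signature (an illustrative sketch):

```python
# Order: positional, *args, keyword-only, **kwargs
def full(a, b=0, *args, key=None, **kwargs):
    return (a, b, args, key, kwargs)

assert full(1) == (1, 0, (), None, {})
assert full(1, 2, 3, 4, key="k", extra=5) == (1, 2, (3, 4), "k", {"extra": 5})
```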
# Unpacking
def add(a, b, c): return a + b + c
nums = [1, 2, 3]
add(*nums) # 6
d = {"a":1, "b":2, "c":3}
add(**d)  # 6

## 18. File I/O (Intermediate)

Always use with statements (context managers) to guarantee files are closed even if exceptions occur. Know the difference between read(), readline(), and iterating line-by-line (memory efficient). Bonus: mention pathlib as the modern alternative to os.path — it shows you write modern Python.
# Always use a context manager
with open("file.txt", "r") as f:
    content = f.read()       # entire file
    lines = f.readlines()    # list of lines
    for line in f:           # lazy line-by-line
        process(line)
# Modes: r, w, a, rb, wb, r+
with open("out.txt", "w") as f:
    f.write("Hello\n")
# pathlib (modern approach)
from pathlib import Path
text = Path("file.txt").read_text()

## 19. Context Managers (Advanced)

Know both the class-based protocol (__enter__/__exit__) and the simpler @contextmanager decorator. Key follow-up: "What happens if an exception occurs inside a with block?" — __exit__ still runs, and returning True suppresses the exception.
# Class-based
class Timer:
    def __enter__(self):
        self.start = time.time()
        return self
    def __exit__(self, exc_type, exc_val, tb):
        print(f"Elapsed: {time.time()-self.start:.3f}")
        return False  # don't suppress exceptions
# Generator-based (simpler)
from contextlib import contextmanager
@contextmanager
def managed_resource():
    r = acquire()
    try:
        yield r
    finally:
        release(r)

## 20. Concurrency & the GIL (Advanced)

Use threading for I/O-bound work (network, disk), multiprocessing for CPU-bound work (computation), and asyncio for high-concurrency I/O (thousands of connections). Know that Python 3.13+ has experimental free-threading (no-GIL) builds.
| Approach | Best For | GIL? |
|---|---|---|
| threading | I/O-bound tasks | Limited by GIL |
| multiprocessing | CPU-bound tasks | Bypasses GIL |
| asyncio | High-concurrency I/O | Single thread |
# asyncio
import asyncio
async def fetch(url):
    await asyncio.sleep(1)
    return f"Done: {url}"
async def main():
    results = await asyncio.gather(
        fetch("url1"), fetch("url2")
    )
asyncio.run(main())
# Threading
from concurrent.futures import ThreadPoolExecutor
with ThreadPoolExecutor(max_workers=4) as pool:
    results = pool.map(process, items)

## 21. Memory Management & Garbage Collection (Advanced)

Know that del doesn't delete objects — it decrements the refcount. Bonus topics: memory pools, integer/string interning, and why is behaves unexpectedly with small integers (-5 to 256). Reference counting is the primary mechanism: objects are freed when their refcount hits 0. A cyclic garbage collector handles reference cycles (generations 0, 1, 2).
import sys, gc
sys.getrefcount(obj) # reference count (+1 for arg)
gc.collect() # force garbage collection
gc.get_threshold() # (700, 10, 10) default
# Weak references — don't increase refcount
import weakref
ref = weakref.ref(obj)
Small ints and interned strings are cached, so is may behave unexpectedly with them.

## 22. Metaclasses (Advanced)

A metaclass is a class whose instances are classes (type is the default metaclass). Used in frameworks like Django ORM and SQLAlchemy for declarative APIs. Know the class creation flow: __new__ → __init__ → __call__. If asked for a practical example, Singleton is the go-to pattern. The key is understanding the concept, even if you don't use it daily.
# Metaclass — a class whose instances are classes
# type is the default metaclass
class SingletonMeta(type):
    _instances = {}
    def __call__(cls, *args, **kwargs):
        if cls not in cls._instances:
            cls._instances[cls] = super().__call__(*args, **kwargs)
        return cls._instances[cls]
class DB(metaclass=SingletonMeta):
    pass
# DB() is DB() → True (same instance)
Class creation flow: __new__ (create) → __init__ (initialize) → __call__ (on instantiation).

## 23. Descriptors & __slots__ (Advanced)

Descriptors power @property, @classmethod, and @staticmethod under the hood. Understanding them shows you know Python's attribute lookup mechanism. __slots__ is a performance optimization question: "How would you reduce memory usage of a class with millions of instances?" The answer: define __slots__ to eliminate the per-instance __dict__ overhead.
# Descriptor protocol: __get__, __set__, __delete__
class Validated:
    def __set_name__(self, owner, name):
        self.name = name
    def __set__(self, obj, value):
        if value < 0: raise ValueError("Must be >= 0")
        obj.__dict__[self.name] = value
    def __get__(self, obj, objtype=None):
        return obj.__dict__.get(self.name, 0)
# __slots__ — restrict attributes, save memory
class Point:
    __slots__ = ("x", "y")
    def __init__(self, x, y):
        self.x = x; self.y = y
# No __dict__ → less memory, faster attribute access

## 24. Abstract Base Classes (Advanced)

Define an interface by subclassing ABC with @abstractmethod. Unlike Java interfaces, Python ABCs can have concrete methods too. A related question: "What's duck typing and how does it relate to ABCs?" Python prefers duck typing, but ABCs add safety when you need guaranteed contracts.
from abc import ABC, abstractmethod
class Shape(ABC):
    @abstractmethod
    def area(self) -> float: ...
    @abstractmethod
    def perimeter(self) -> float: ...
class Circle(Shape):
    def __init__(self, r): self.r = r
    def area(self): return 3.14159 * self.r**2
    def perimeter(self): return 2 * 3.14159 * self.r
# Shape() → TypeError (can't instantiate abstract class)

## 25. Design Patterns (Advanced)

# Singleton (module-level is simplest in Python)
# Just put state in a module — modules are singletons
# Factory
class AnimalFactory:
    @staticmethod
    def create(kind):
        return {"dog": Dog, "cat": Cat}[kind]()
# Observer (simplified)
class EventEmitter:
    def __init__(self):
        self._listeners = {}
    def on(self, event, fn):
        self._listeners.setdefault(event, []).append(fn)
    def emit(self, event, *args):
        for fn in self._listeners.get(event, []):
            fn(*args)
# Strategy — pass behavior as function/callable
def process(data, strategy):
    return strategy(data)

## 26. Common Gotchas & Tips (Intermediate)

Mutable default arguments, is vs ==, and late-binding closures appear constantly. Knowing these gotchas shows real-world Python experience. Pro tip: if you can explain why the gotcha exists (not just what it does), you'll stand out from other candidates.
| Gotcha | Explanation |
|---|---|
| Mutable default args | def f(x=[]) shares the list across calls. Use None instead. |
| is vs == | is checks identity, == checks equality. Use == for values. |
| Late-binding closures | Lambdas in loops capture the variable, not its value. Fix: lambda x, i=i: ... |
| Circular imports | A imports B, B imports A. Fix: restructure or import inside a function. |
| a += b vs a = a + b | For mutables, += mutates in place; + creates a new object. |
| Chained comparison | 1 < x < 10 works! It's 1 < x and x < 10. |
| Truthy / falsy | 0, "", [], {}, None, False are falsy. Everything else is truthy. |
| __name__ == "__main__" | Guard to run code only when the script is executed directly, not imported. |
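The late-binding row above, demonstrated along with the default-argument fix:

```python
# Late binding: every lambda sees the final value of i
fns = [lambda: i for i in range(3)]
assert [f() for f in fns] == [2, 2, 2]

# Fix: bind the current value as a default argument
fns = [lambda i=i: i for i in range(3)]
assert [f() for f in fns] == [0, 1, 2]
```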
Know the handy built-ins: enumerate, zip, any, all, isinstance, getattr, hasattr, vars, dir.

## 27. Type Hints & the typing Module (Intermediate)

Type hints improve readability, tooling (mypy, IDE autocompletion), and catch bugs before runtime. Know the difference between runtime behavior (hints are not enforced) and static analysis. Since 3.9, built-in types work directly as generics (list[int] instead of List[int]).
# Basic type hints
def greet(name: str, times: int = 1) -> str:
    return name * times
# Built-in generics (3.9+)
scores: list[int] = [90, 85, 92]
mapping: dict[str, float] = {"pi": 3.14}
# Union & Optional
from typing import Optional
def find(key: str) -> int | None:     # 3.10+ union syntax
    ...
def find(key: str) -> Optional[int]:  # equivalent, older style
    ...
# TypeVar & Generics
from typing import TypeVar
T = TypeVar("T")
def first(items: list[T]) -> T:
    return items[0]
# Callable & Protocol
from typing import Callable, Protocol
Transformer = Callable[[str], str]  # fn(str) -> str
class Drawable(Protocol):           # structural subtyping
    def draw(self) -> None: ...
Use mypy or pyright for static checking. Since 3.12, use the type statement for aliases: type Vector = list[float].

## 28. Dataclasses & NamedTuples (Intermediate)

@dataclass auto-generates __init__, __repr__, __eq__, and more. Key question: "When would you use a dataclass vs a dict vs a NamedTuple?" The answer depends on mutability, type safety, and whether you need methods. frozen=True makes instances immutable and hashable.
from dataclasses import dataclass, field
@dataclass
class User:
    name: str
    age: int
    tags: list[str] = field(default_factory=list)
u = User("Alice", 30)
print(u)  # User(name='Alice', age=30, tags=[])
# Frozen (immutable + hashable)
@dataclass(frozen=True)
class Point:
    x: float
    y: float
# __post_init__ for derived fields
@dataclass
class Rect:
    w: float
    h: float
    area: float = field(init=False)
    def __post_init__(self):
        self.area = self.w * self.h
# NamedTuple — immutable, lightweight
from typing import NamedTuple
class Coord(NamedTuple):
    x: float
    y: float

| Feature | dataclass | NamedTuple | dict |
|---|---|---|---|
| Mutable | Yes (default) | No | Yes |
| Type-safe | Yes (with mypy) | Yes | No |
| Hashable | If frozen | Yes | No |
| Methods | Yes | Limited | No |
## 29. Modules, Packages & Imports (Beginner)

A module is a single .py file; a package is a directory with __init__.py. Know the difference between absolute and relative imports, how sys.path is resolved, and why circular imports happen (and how to fix them).
# Absolute import
from mypackage.utils import helper
# Relative import (inside a package)
from . import sibling_module
from ..utils import helper
# Package structure
# mypackage/
# __init__.py ← makes it a package
# core.py
# utils/
# __init__.py
# helper.py
# sys.path — where Python looks for modules
import sys
print(sys.path) # ['', '/usr/lib/python3/...']
# Lazy / conditional import
def process():
    import heavy_lib  # only imported when called
# __all__ controls "from module import *"
__all__ = ["public_fn", "PublicClass"]
# Reload (debugging only)
from importlib import reload
reload(my_module)

## 30. Regular Expressions (Intermediate)

Know re.match() (start of string), re.search() (anywhere), and re.findall() (all matches). Groups, quantifiers, and character classes come up in string processing challenges. Always use raw strings (r"...") for patterns.
import re
# Core functions
re.search(r"\d+", "age: 25") # match object for "25"
re.match(r"\d+", "25 apples") # matches at start
re.findall(r"\d+", "3 cats, 5 dogs") # ['3', '5']
re.sub(r"\s+", "-", "a b c") # "a-b-c"
re.split(r"[,;]", "a,b;c") # ['a', 'b', 'c']
# Groups
m = re.search(r"(\w+)@(\w+)\.(\w+)", "a@b.com")
m.group(1) # 'a'
m.groups() # ('a', 'b', 'com')
# Named groups
m = re.search(r"(?P<user>\w+)@(?P<domain>\w+)", "a@b.com")
m.group("user") # 'a'
# Compile for reuse
pattern = re.compile(r"^\d{3}-\d{4}$")
pattern.match("123-4567") # efficient for repeated use

| Pattern | Meaning |
|---|---|
| \d, \w, \s | Digit, word char, whitespace |
| . | Any char (except newline) |
| *, +, ? | 0+, 1+, 0 or 1 |
| {n,m} | Between n and m times |
| ^, $ | Start, end of string |
| [abc], [^abc] | Char class, negated class |
| (?:...) | Non-capturing group |
| (?=...), (?!...) | Lookahead, negative lookahead |
## 31. Testing with pytest (Intermediate)

pytest is the de facto standard — simpler than unittest, with powerful features like fixtures, parametrize, and auto-discovery. Know how to write basic assertions, use fixtures for setup/teardown, mock external dependencies, and structure test files.
# test_math.py — pytest auto-discovers test_*.py
def test_addition():
    assert 1 + 1 == 2
def test_exception():
    import pytest
    with pytest.raises(ZeroDivisionError):
        1 / 0
# Fixtures — reusable setup
import pytest
@pytest.fixture
def sample_list():
    return [1, 2, 3]
def test_length(sample_list):
    assert len(sample_list) == 3
# Parametrize — test multiple inputs
@pytest.mark.parametrize("inp, expected", [
    (2, 4), (3, 9), (-1, 1)
])
def test_square(inp, expected):
    assert inp ** 2 == expected
# Mocking
from unittest.mock import patch, MagicMock
@patch("mymodule.requests.get")
def test_api(mock_get):
    mock_get.return_value.json.return_value = {"ok": True}
    result = fetch_data()
    assert result["ok"] is True
Useful flags: pytest -v (verbose), pytest -x (stop on first failure), pytest --cov (coverage report).

## Top Interview Questions & Answers (Must Know)
### Fundamentals
Q1. What are the key differences between Python 2 and Python 3?
Python 3 is the modern, actively maintained version; Python 2 reached end-of-life in January 2020. Key differences:
- print: print x (statement, Py2) vs print(x) (function, Py3).
- Division: 5/2 returns 2 in Py2 (integer division) and 2.5 in Py3 (true division); use // for floor division in Py3.
- Strings: Py3 strings are Unicode by default (str is text, bytes is binary). Py2 had str as bytes and unicode as text.
- xrange/range: Py3's range is lazy (like Py2's xrange); xrange is gone.
- Exception syntax: except Exception as e (Py3); Py2 also allowed except Exception, e.
- Type hints, f-strings, async/await, the walrus operator, dataclasses: Py3-only features.
Q2. Explain mutable vs immutable types. Why does it matter?
Immutable objects cannot change after creation — any "modification" creates a new object. Examples: int, float, bool, str, tuple, frozenset, bytes.
Mutable objects can be modified in place. Examples: list, dict, set, bytearray, and most custom classes.
Why it matters:
- Only immutable (hashable) types can be dictionary keys or set elements.
- Mutable default arguments cause a classic bug: the default is shared across all calls.
- Passing a mutable object to a function lets the function modify your original data.
# Mutable default argument pitfall
def append_to(item, target=[]):  # BAD — list is shared
    target.append(item)
    return target
append_to(1)  # [1]
append_to(2)  # [1, 2] — surprise!
# Correct pattern
def append_to(item, target=None):
    if target is None:
        target = []
    target.append(item)
    return target

Q3. What is the difference between is and ==?
== compares values (calls __eq__). is compares identity — whether two names refer to the exact same object in memory (same id()).
a = [1, 2, 3]
b = [1, 2, 3]
a == b # True — same values
a is b # False — different objects
x = None
x is None # correct idiom for None checks
Always use is for comparing to None, True, and False (these are singletons). Use == for value equality.
Note: small integers (-5 to 256) and short interned strings are cached, so a is b may return True for small ints — this is a CPython implementation detail; never rely on it.

Q4. List vs tuple — when would you use each?
List: mutable, variable length, used when contents will change. Slight memory overhead and slower than tuples.
Tuple: immutable, fixed length, slightly faster, hashable (if elements are hashable), can be a dict key.
- Use a list for homogeneous, changing collections: scores = [90, 85, 70].
- Use a tuple for fixed records / heterogeneous data: point = (x, y), rgb = (255, 0, 0).
- Tuples are often returned when a function needs to return multiple values: return name, age.
Q5. How are Python dictionaries implemented, and what is their time complexity?
Python dicts are implemented as open-addressed hash tables. Since Python 3.7 they also preserve insertion order (a language-level guarantee, not just an implementation detail).
Time complexity (average / amortized):
- d[k], k in d, d[k] = v, del d[k] — all O(1) average.
- Worst case is O(n) if all keys hash to the same bucket (rare; Python randomizes hash seeds).
Keys must be hashable (implement __hash__ and __eq__ consistently). That's why lists and dicts can't be keys — they're mutable and unhashable.
Since Python 3.6, CPython uses a compact dict layout that reduces memory by ~20–25% and preserves order.
Q6. What is the difference between *args and **kwargs?
*args collects extra positional arguments into a tuple. **kwargs collects extra keyword arguments into a dict.
def demo(a, *args, **kwargs):
    print(a, args, kwargs)
demo(1, 2, 3, x=10, y=20)
# 1 (2, 3) {'x': 10, 'y': 20}
# Unpacking on the call side
nums = [1, 2, 3]
opts = {"x": 10}
demo(*nums, **opts)
The names args / kwargs are convention — the * and ** are what matters. Common uses: writing decorators that forward arguments, and building flexible APIs.
Q7. What is the LEGB rule?
LEGB is the order Python searches for a name: Local → Enclosing → Global → Built-in.
- Local: names defined inside the current function.
- Enclosing: names in any outer enclosing function (for nested functions / closures).
- Global: names at the top level of the module.
- Built-in: names pre-loaded by Python (len, print, range, ...).
To modify a name in an outer scope, use global x or nonlocal x.
x = "global"
def outer():
    x = "enclosing"
    def inner():
        nonlocal x
        x = "modified"
    inner()
    print(x)  # modified

### OOP & Classes
Q8. Explain the four pillars of OOP in Python.
- Encapsulation: bundling data and methods, with name mangling (_protected, __private) for weak access control. Python relies on convention, not enforcement.
- Inheritance: class Dog(Animal) lets Dog reuse and extend Animal. Python supports multiple inheritance, resolved via MRO (C3 linearization).
- Polymorphism: the same method name behaves differently across classes. Python embraces duck typing: "if it walks like a duck..." — you don't need a common base class, just a common interface.
- Abstraction: hiding implementation details behind a clean interface, often using abc.ABC and @abstractmethod to define contracts.
Q9. @classmethod vs @staticmethod vs instance method — when to use each?
- Instance method — first arg is self. Accesses and modifies instance state.
- @classmethod — first arg is cls. Operates on the class itself. Common use: alternative constructors.
- @staticmethod — no automatic first argument. Just a regular function namespaced inside the class. Use when logic is conceptually related but doesn't need self or cls.
class Pizza:
    def __init__(self, toppings):
        self.toppings = toppings
    @classmethod
    def margherita(cls):  # alternative constructor
        return cls(["tomato", "mozzarella"])
    @staticmethod
    def is_valid_topping(t):  # utility, no state needed
        return t in {"cheese", "tomato", "basil"}

Q10. What is MRO and how does Python resolve multiple inheritance?
MRO (Method Resolution Order) is the order Python searches base classes when looking up an attribute or method. Python uses the C3 linearization algorithm, which guarantees:
- A class appears before its parents.
- Parents appear in the order they're listed.
- Monotonicity — the order is consistent throughout the hierarchy.
class A: pass
class B(A): pass
class C(A): pass
class D(B, C): pass
print(D.__mro__)
# (D, B, C, A, object) — the classic "diamond" resolved cleanly
super() follows the MRO, not just the direct parent — which is why cooperative multiple inheritance works in Python.
Q11. What are dunder (magic) methods? Name the most important ones.
Dunder methods (double underscore, e.g. __init__) let you integrate your class with Python's built-in syntax and protocols. The important ones:
- __init__, __new__ — construction.
- __repr__, __str__ — debug and user-facing representations.
- __eq__, __hash__, __lt__, __gt__ — equality and ordering.
- __len__, __iter__, __next__, __contains__, __getitem__ — container / iteration protocols.
- __enter__, __exit__ — context manager protocol.
- __call__ — make instances callable like functions.
- __add__, __mul__, __sub__, ... — operator overloading.
If you override __eq__, always define __hash__ too — or set it to None to make the object unhashable.

Q12. What is @property and why use it?
@property turns a method into a read-only attribute, letting you add validation or computed values without changing the attribute interface callers use.
class Circle:
    def __init__(self, radius):
        self._radius = radius
    @property
    def radius(self):
        return self._radius
    @radius.setter
    def radius(self, value):
        if value < 0:
            raise ValueError("radius must be non-negative")
        self._radius = value
    @property
    def area(self):
        return 3.14159 * self._radius ** 2
c = Circle(5)
print(c.area)  # accessed like an attribute, computed on demand

Q13. Difference between __init__ and __new__?
__new__ is a static method that creates and returns a new instance. __init__ initializes the instance after it's been created.
- __new__(cls, ...) runs first. It must return an instance (usually by calling super().__new__(cls)).
- __init__(self, ...) runs next, only if __new__ returned an instance of cls.
You rarely override __new__. Real use cases: implementing singletons, subclassing immutable types (int, str, tuple), or metaclasses.
### Advanced Concepts
Q14. How do decorators work? Write one from scratch.
A decorator is a callable that takes a function (or class) and returns a new callable, typically wrapping the original. @decorator is just syntactic sugar for func = decorator(func).
import functools, time
def timer(func):
    @functools.wraps(func)
    def wrapper(*args, **kwargs):
        start = time.perf_counter()
        result = func(*args, **kwargs)
        elapsed = time.perf_counter() - start
        print(f"{func.__name__} took {elapsed:.4f}s")
        return result
    return wrapper
@timer
def slow():
    time.sleep(0.5)
slow()  # slow took 0.5012s
functools.wraps is essential — it copies __name__, __doc__, and the signature from the original, so introspection and debugging still work.
Q15. What is a generator and when should you use one?
A generator is a function that uses yield instead of (or in addition to) return. It produces values lazily, one at a time, maintaining state between calls. Memory-efficient for large or infinite sequences.
def fib(n):
    a, b = 0, 1
    for _ in range(n):
        yield a
        a, b = b, a + b
list(fib(6))  # [0, 1, 1, 2, 3, 5]
# Generator expression — like a list comp, but lazy
squared_sum = sum(x*x for x in range(10**6))
Use generators when the sequence is huge, infinite, or streaming from I/O, or when you only need to iterate once. They trade random access for O(1) memory.
Q16. What is a context manager? How do you implement one?
A context manager is an object usable with with that guarantees setup and cleanup. The protocol is __enter__ and __exit__. Common use: managing resources like files, locks, DB connections.
# Class-based
import time
class Timer:
    def __enter__(self):
        self.start = time.perf_counter()
        return self
    def __exit__(self, exc_type, exc_val, tb):
        print(f"Elapsed: {time.perf_counter() - self.start:.3f}s")
with Timer():
    expensive_work()
# Decorator-based — simpler for one-off managers
from contextlib import contextmanager
@contextmanager
def timer():
    start = time.perf_counter()
    yield
    print(f"Elapsed: {time.perf_counter()-start:.3f}s")
__exit__ receives the exception info; return True to suppress it, anything else (or None) to let it propagate.
Q17. What is a metaclass?
A metaclass is "the class of a class." In Python, classes are themselves objects, and their type is a metaclass — by default, type. Metaclasses let you customize class creation (add methods, validate, register subclasses).
class UpperMeta(type):
    def __new__(mcs, name, bases, ns):
        ns = {k.upper() if not k.startswith("__") else k: v
              for k, v in ns.items()}
        return super().__new__(mcs, name, bases, ns)
class Foo(metaclass=UpperMeta):
    def bar(self): return 42
Foo().BAR()  # 42 — 'bar' was uppercased at class creation
Metaclasses are rarely needed; prefer __init_subclass__ for 99% of cases.

Q18. What are Python's iterators and the iterator protocol?
An iterable is any object that implements __iter__ and returns an iterator. An iterator implements __iter__ (returning self) and __next__, which returns the next value or raises StopIteration.
class CountDown:
    def __init__(self, start):
        self.current = start
    def __iter__(self):
        return self
    def __next__(self):
        if self.current <= 0:
            raise StopIteration
        self.current -= 1
        return self.current + 1
list(CountDown(3))  # [3, 2, 1]
A for loop calls iter() on the iterable and then next() until StopIteration.
Q19. What are dataclasses and why use them?
@dataclass auto-generates __init__, __repr__, and __eq__ from type-annotated class attributes. Added in Python 3.7. Reduces boilerplate for simple data containers.
from dataclasses import dataclass, field
@dataclass
class User:
name: str
age: int = 0
tags: list = field(default_factory=list)
u = User("Ada", 36)
print(u) # User(name='Ada', age=36, tags=[])Useful options: @dataclass(frozen=True) makes instances immutable and hashable; @dataclass(slots=True) (3.10+) adds __slots__ for memory savings.
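A quick sketch of the `frozen=True` option mentioned above (the `Color` class is illustrative):

```python
from dataclasses import dataclass, FrozenInstanceError

@dataclass(frozen=True)
class Color:
    r: int
    g: int
    b: int

c = Color(255, 0, 0)
print(c == Color(255, 0, 0))              # True: auto-generated __eq__
print(hash(c) == hash(Color(255, 0, 0)))  # True: frozen makes it hashable

try:
    c.r = 0
except FrozenInstanceError:
    print("immutable")
```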
Concurrency & Memory
Q20. What is the GIL and how does it affect Python programs?
The Global Interpreter Lock (GIL) is a mutex in CPython that ensures only one thread executes Python bytecode at a time. It simplifies memory management and C extensions but prevents true parallelism of CPU-bound Python code across threads.

Implications:
- I/O-bound workloads (network, disk) benefit from threads — the GIL is released during blocking I/O.
- CPU-bound workloads don't scale with threads. Use `multiprocessing`, `concurrent.futures.ProcessPoolExecutor`, or C extensions (NumPy, PyTorch) that release the GIL.
- `asyncio` is an alternative for high-concurrency I/O within a single thread.
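The CPU-bound limitation can be observed directly. A minimal sketch (the loop size and the `count` function are illustrative) timing two back-to-back calls against two threads doing the same work on standard CPython:

```python
import threading
import time

def count(n):
    # Pure-Python CPU-bound loop: holds the GIL while it runs
    total = 0
    for i in range(n):
        total += i
    return total

N = 2_000_000

# Sequential: two calls back to back
t0 = time.perf_counter()
count(N)
count(N)
seq = time.perf_counter() - t0

# Threaded: the GIL serializes the bytecode, so wall time is
# typically about the same as (or worse than) the sequential run
t0 = time.perf_counter()
threads = [threading.Thread(target=count, args=(N,)) for _ in range(2)]
for t in threads:
    t.start()
for t in threads:
    t.join()
thr = time.perf_counter() - t0

print(f"sequential: {seq:.2f}s  threaded: {thr:.2f}s")
```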
Q21. Threading vs multiprocessing vs asyncio — which do I pick?
| Approach | Best for | Why |
|---|---|---|
| threading | I/O-bound, blocking libraries | Lightweight; GIL released on I/O. Hundreds of threads OK. |
| multiprocessing | CPU-bound work | Each process has its own interpreter and GIL; true parallelism. Higher memory/IPC cost. |
| asyncio | Many concurrent I/O tasks (10k+ sockets) | Single thread, cooperative scheduling. Requires async-aware libraries. |
Quick rule: CPU-heavy → processes. Blocking I/O and small scale → threads. High-scale network I/O with async libraries → asyncio.
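To make the "threads for blocking I/O" row concrete, a hedged sketch with `concurrent.futures` (the `fetch` function and the example.com URLs are stand-ins for real network calls):

```python
import time
from concurrent.futures import ThreadPoolExecutor

def fetch(url):
    # Stand-in for a blocking network request (hypothetical URLs)
    time.sleep(0.1)
    return f"data from {url}"

urls = [f"https://example.com/{i}" for i in range(8)]

start = time.perf_counter()
with ThreadPoolExecutor(max_workers=8) as pool:
    results = list(pool.map(fetch, urls))
elapsed = time.perf_counter() - start

# The eight 0.1s "requests" overlap, so total time is well below
# the 0.8s a sequential loop would take
print(f"{len(results)} results in {elapsed:.2f}s")
```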
Q22. How does Python manage memory and garbage collection?
CPython uses two mechanisms:
- Reference counting: every object tracks how many references point to it. When the count hits zero, the object is freed immediately. Fast and deterministic.
- Generational garbage collector (the `gc` module): detects and collects reference cycles (e.g., two objects referring to each other) that ref counting can't. Objects are grouped into 3 generations; newer generations are collected more frequently.

You can inspect and control it with `gc.collect()`, `gc.disable()`, and `sys.getrefcount(obj)`. Use `weakref` to avoid creating cycles in caches and observers.
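Both mechanisms can be poked at from the interpreter. A small sketch (the `Node` class is illustrative) showing a reference cycle that only the cyclic collector can reclaim:

```python
import gc
import sys
import weakref

class Node:
    pass

# Reference counting: getrefcount reports at least 2 here
# (the name 'a' plus the temporary argument reference)
a = Node()
print(sys.getrefcount(a))

# Build a reference cycle that ref counting alone cannot free
x, y = Node(), Node()
x.partner, y.partner = y, x
probe = weakref.ref(x)   # weak ref: does not keep x alive

del x, y                 # the cycle still holds both objects
gc.collect()             # the cyclic collector breaks it
print(probe() is None)   # True: the cycle was reclaimed
```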
Q23. What is __slots__ and when should you use it?
By default, Python stores instance attributes in a per-instance `__dict__`. Defining `__slots__` tells Python to use a fixed-size array instead, which:
- Reduces memory (~40–50%) — important when creating millions of instances.
- Speeds up attribute access slightly.
- Prevents adding attributes not listed in `__slots__`.

```python
class Point:
    __slots__ = ("x", "y")
    def __init__(self, x, y):
        self.x, self.y = x, y

p = Point(1, 2)
p.z = 3  # AttributeError — slot 'z' not defined
```

Trade-offs: no dynamic attributes, and multiple inheritance with slots is tricky. Use only when profiling shows memory pressure from many instances.
Coding & Gotchas
Q24. Shallow vs deep copy — show the difference.
A shallow copy creates a new outer container but shares references to the inner objects. A deep copy recursively copies everything.

```python
import copy

original = [[1, 2], [3, 4]]
shallow = copy.copy(original)
deep = copy.deepcopy(original)

shallow[0].append(99)
print(original)  # [[1, 2, 99], [3, 4]] — shared inner list!
print(deep)      # [[1, 2], [3, 4]] — fully independent
```

Shortcuts for shallow copies: `list(x)`, `x[:]`, `x.copy()`, `{**d}`.
Q25. How do you reverse a string? Remove duplicates from a list preserving order?
```python
# Reverse a string — slicing is the idiomatic one-liner
s = "hello"
s[::-1]               # 'olleh'
"".join(reversed(s))  # also works

# Dedup preserving order — dict keeps insertion order (Py3.7+)
xs = [3, 1, 2, 3, 1, 4]
list(dict.fromkeys(xs))  # [3, 1, 2, 4]

# Dedup (order not needed)
list(set(xs))  # order-unstable

# Palindrome check
def is_palindrome(s):
    s = s.lower()
    return s == s[::-1]

# Flatten one level
nested = [[1, 2], [3, 4], [5]]
flat = [x for row in nested for x in row]
```

Q26. Explain the mutable default argument gotcha.
Default argument values are evaluated once, when the function is defined, not on every call. If the default is a mutable object, all calls share it.

```python
def bad(x, acc=[]):
    acc.append(x)
    return acc

bad(1)  # [1]
bad(2)  # [1, 2] ← leaked state

def good(x, acc=None):
    if acc is None:
        acc = []
    acc.append(x)
    return acc
```

Always use `None` as the sentinel and create a fresh object inside the function.
Q27. Explain the late-binding closure pitfall in loops.
Closures capture variables by reference, not by value. When the closure runs, it looks up the *current* value of the variable — which often leads to surprises in loops.

```python
funcs = [lambda: i for i in range(3)]
[f() for f in funcs]  # [2, 2, 2] — all see the final i

# Fix: bind the current value via a default argument
funcs = [lambda i=i: i for i in range(3)]
[f() for f in funcs]  # [0, 1, 2]
```

Q28. What does if __name__ == "__main__": do?
When Python runs a file directly, it sets `__name__` to `"__main__"`. When the file is imported as a module, `__name__` is set to the module's name. The `if` guard lets the same file behave as both a script and an importable module:

```python
def main():
    print("Running as a script")

if __name__ == "__main__":
    main()
```

Essential when using `multiprocessing` on Windows/macOS — without the guard, spawned subprocesses re-execute the module's top level and try to spawn again recursively (modern CPython detects this and raises a `RuntimeError`).
Q29. What's the difference between append, extend, and += on a list?
```python
a = [1, 2]
a.append([3, 4])  # [1, 2, [3, 4]] — adds ONE element (the list)

a = [1, 2]
a.extend([3, 4])  # [1, 2, 3, 4] — adds each element

a = [1, 2]
a += [3, 4]       # [1, 2, 3, 4] — same as extend for lists
```

`+=` on a list mutates in place (like `extend`); `+=` on a tuple rebinds the name to a new tuple.
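A related classic gotcha worth knowing: `+=` on a mutable element *inside* a tuple both mutates the element and raises. A sketch:

```python
t = ([1, 2],)
try:
    t[0] += [3]   # list.__iadd__ mutates in place, then the
                  # tuple item assignment raises TypeError
except TypeError:
    pass

print(t)  # ([1, 2, 3],) — the list was extended even though += raised
```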
Q30. How would you read a huge file line by line without loading it into memory?
Iterate directly over the file object — it yields one line at a time, using O(1) memory regardless of file size.

```python
with open("huge.log") as f:
    for line in f:
        process(line.rstrip())

# Count matching lines — also streaming
with open("huge.log") as f:
    errors = sum(1 for line in f if "ERROR" in line)
```

Avoid `f.readlines()` or `f.read()` on large files — they load everything at once.
Q31. What's the difference between zip, enumerate, and map?
```python
# enumerate — index + value
for i, name in enumerate(["a", "b", "c"], start=1):
    print(i, name)

# zip — pair parallel iterables (stops at the shortest)
names = ["Alice", "Bob"]
ages = [30, 25]
for n, a in zip(names, ages):
    print(n, a)

# map — apply a function to each element (lazy iterator)
squares = list(map(lambda x: x*x, [1, 2, 3]))  # [1, 4, 9]
```

All three return iterators in Python 3; wrap in `list()` to materialize. For `map`, a list comprehension is usually clearer: `[x*x for x in xs]`.
Q32. Why is 0.1 + 0.2 != 0.3 in Python?
Python's `float` is an IEEE 754 double-precision binary number. Decimals like 0.1 can't be represented exactly in binary, so tiny rounding errors accumulate.

```python
0.1 + 0.2  # 0.30000000000000004

# Safe equality for floats
import math
math.isclose(0.1 + 0.2, 0.3)  # True

# Need exact decimals? Use Decimal
from decimal import Decimal
Decimal("0.1") + Decimal("0.2")  # Decimal('0.3')
```

Use `decimal.Decimal` for money and `math.isclose` for approximate float comparisons.
Q33. How do you merge two dictionaries?
```python
a = {"x": 1, "y": 2}
b = {"y": 20, "z": 3}

# Python 3.9+ — merge operator
merged = a | b  # {'x': 1, 'y': 20, 'z': 3}

# Any Python 3.5+
merged = {**a, **b}

# In-place update
a.update(b)
```

On conflicting keys, the right-hand dict wins. Use `collections.ChainMap` if you want a view that layers dicts without copying.
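A quick sketch of the `ChainMap` option (the settings dicts are illustrative): lookups search the maps in order, and the result is a live view rather than a copy.

```python
from collections import ChainMap

defaults = {"theme": "light", "lang": "en"}
user = {"theme": "dark"}

settings = ChainMap(user, defaults)  # lookups search user first
print(settings["theme"])  # 'dark' (found in user)
print(settings["lang"])   # 'en'   (falls through to defaults)

user["lang"] = "fr"       # ChainMap reflects changes to the underlying dicts
print(settings["lang"])   # 'fr'
```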
Q34. How do you handle exceptions well? try/except/else/finally.
```python
try:
    data = fetch()
except ConnectionError as e:
    log("retrying", e)
    data = fetch_from_cache()
except ValueError:
    raise  # bubble up
else:
    log("success")  # runs only if no exception
finally:
    cleanup()  # always runs
```

Best practices:
- Catch specific exceptions — never bare `except:` (it hides `KeyboardInterrupt` and `SystemExit`).
- Use `raise` (no argument) to re-raise while preserving the traceback.
- Chain with `raise NewError("context") from err`.
- Prefer EAFP ("easier to ask forgiveness than permission") — try the operation and catch, rather than checking conditions up front.
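The chaining point deserves a concrete sketch. The `load_config` helper and path below are illustrative; `from err` records the original exception as `__cause__`, so the full traceback chain survives:

```python
def load_config(path):
    try:
        with open(path) as f:
            return f.read()
    except OSError as err:
        # 'from err' preserves the original exception as __cause__
        raise RuntimeError(f"could not load config {path!r}") from err

try:
    load_config("/nonexistent/app.cfg")
except RuntimeError as e:
    cause = e.__cause__

print(type(cause).__name__)  # FileNotFoundError
```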
Q35. What Python anti-patterns should I avoid?
- Bare `except:` — catches everything, including `SystemExit`. Use specific exceptions.
- Mutable default arguments (see Q26).
- `for i in range(len(xs)): xs[i]` — use `for x in xs` or `enumerate`.
- String concatenation in a loop (`s += x`) — use `"".join(parts)`.
- `if x == True:` — use `if x:`.
- `type(x) == MyClass` — use `isinstance(x, MyClass)` to respect inheritance.
- Using `==` to compare with `None` — use `is None`.
- Catching and silently ignoring exceptions (`except: pass`) — at least log them.
- Overusing classes for what a function would do — not everything needs to be OOP.
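Two of the items above, side by side with their idiomatic fixes (a minimal sketch; the `words` list is illustrative):

```python
words = ["a", "b", "c"]

# Anti-pattern: string concatenation in a loop (quadratic copying)
s = ""
for w in words:
    s += w

# Idiomatic: join builds the string in one pass
assert s == "".join(words) == "abc"

# Anti-pattern: index-based loop
out = []
for i in range(len(words)):
    out.append(words[i].upper())

# Idiomatic: iterate directly
assert out == [w.upper() for w in words]
```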