

Combinatorics


Last updated 7 months ago



All about order and repetition

Combination $\Rightarrow$ when the order does not matter

Permutation $\Rightarrow$ when the order does matter

Permutations

With repetition

If we wish to know the number of possible PINs, we have 10 options per digit (the numbers 0 to 9). Since the order of the digits matters and digits may repeat, these are permutations with repetition.

So a 4-digit PIN has $10 \cdot 10 \cdot 10 \cdot 10$ possibilities ($n \cdot n \cdot n \cdot n$). Generalizing, we get the formula:

$$N^R \quad R \text{ items selected out of } N$$
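As a quick sketch of the formula above in Python (the function name is just illustrative):

```python
def permutations_with_repetition(n: int, r: int) -> int:
    """Number of length-r sequences drawn from n options, repetition allowed: n**r."""
    return n ** r

# A 4-digit PIN: 10 options (digits 0-9) per digit.
print(permutations_with_repetition(10, 4))  # 10000
```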

With no repetition

If we want to know how many orders $N$ pool balls can be drawn in, each ball taken leaves one fewer option for the next draw: $N_{i+1} = N_i - 1$ possibilities.

Generalizing, we get:

$$\frac{N!}{(N-R)!} \quad R \text{ items selected out of } N \text{ with no repetition}$$

In the particular case where $R$ is equal to $N$, we get $N!$ possibilities.

Combinations

With no repetition

Here we can compute the number of possible results in lotteries: we take $R$ out of $N$ unique items, but we don't mind the order of the taken items.

We divide the $N!$ possible orderings by the $(N-R)!$ orderings of the items we did not take and by the $R!$ possible orderings of the taken items. This is expressed as:

$$C(N, R) = {}_{N}C_{R} = \binom{N}{R} = \frac{N!}{R!\,(N-R)!}$$

With repetition

This problem comes by many names (stars and bars, balls and urns): it is basically the question of how many ways there are to distribute $R$ objects (call them "balls") into $N$ categories (call them "urns"). We can think of it as follows.

Take $R$ balls and $N-1$ dividers. If a ball falls between two dividers, it goes into the corresponding urn. If there's nothing between two dividers, then the corresponding urn is empty. Let's look at this with a concrete example.

Say we want to distribute 5 balls into 3 urns, so we take 5 balls and $3 - 1 = 2$ dividers. Visually:

|ooo|oo

In this arrangement we'd have nothing in the first urn, three balls in the second urn and two balls in the third urn. The question then is: how many ways can we arrange these 5 balls and 2 dividers? Any arrangement of the $R + N - 1$ symbols corresponds to exactly one distribution, so:

$$\binom{R+N-1}{R} = \binom{R+N-1}{N-1} = \frac{(R+N-1)!}{R!\,(N-1)!}$$
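The three counting formulas above can be checked with Python's `math` module; `math.perm` and `math.comb` implement $\frac{N!}{(N-R)!}$ and $\binom{N}{R}$ directly, and the stars-and-bars count can be cross-checked by brute-force enumeration (the helper name is just illustrative):

```python
import math
from itertools import product

# Permutations without repetition: N!/(N-R)!
print(math.perm(5, 2))    # 20, i.e. 5*4

# Combinations without repetition: N!/(R!(N-R)!)
print(math.comb(49, 6))   # 13983816, the classic 6-of-49 lottery

# Combinations with repetition ("stars and bars"):
# distribute R identical balls into N urns -> C(R+N-1, N-1).
def distributions(r_balls: int, n_urns: int) -> int:
    return math.comb(r_balls + n_urns - 1, n_urns - 1)

# Cross-check by brute force: count all (a, b, c) with a + b + c == 5.
brute = sum(1 for counts in product(range(6), repeat=3) if sum(counts) == 5)
print(distributions(5, 3), brute)  # 21 21
```

The brute-force loop enumerates every way to fill three urns with up to five balls each and keeps only the ones using exactly five, matching $\binom{5+3-1}{3-1} = \binom{7}{2} = 21$.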
Sources:

- Combinatorics (Mathigon)
- Combinations and Permutations (Math is Fun)
- Math Stack Exchange question