8085 Microprocessor Architecture & Assembly Language Fundamentals
Bus Organization in 8085 Microprocessors
A bus in the 8085 microprocessor is a group of wires used for communication between different components. There are three main types:
- Data Bus: Carries actual data, like a delivery van.
- Address Bus: Carries the memory address to access data, like a GPS.
- Control Bus: Carries control signals (e.g., read/write instructions). These signals coordinate data movement between the CPU, memory, and I/O devices.
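The bus widths determine the 8085's reach: a 16-bit address bus and an 8-bit data bus. A minimal sketch of the arithmetic (constants are standard 8085 facts, the variable names are illustrative):

```python
# 8085 bus widths (A0-A15 address lines, D0-D7 data lines;
# the low address byte is multiplexed with data as AD0-AD7).
ADDRESS_BUS_WIDTH = 16
DATA_BUS_WIDTH = 8

addressable_bytes = 2 ** ADDRESS_BUS_WIDTH  # 65536 locations = 64 KB
data_values = 2 ** DATA_BUS_WIDTH           # each transfer carries one 8-bit value (0-255)

print(addressable_bytes)  # 65536
print(data_values)        # 256
```

This is why the 8085 can address at most 64 KB of memory while moving one byte per transfer.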
Memory Addressing & Mapping Fundamentals
Memory addressing
NLP Foundations: From Text Processing to Large Language Models
Week 1: Working with Words
Tokenization:
Splitting text into discrete units (tokens), typically words or punctuation marks. Techniques vary (a simple split on whitespace vs. advanced tokenizers); challenges include handling punctuation, contractions, multi-word names, and different languages (e.g., Chinese has no spaces). Good tokenization is foundational for all NLP tasks.
Bag-of-Words (BoW):
Representing a document by the counts of each word in a predefined vocabulary, ignoring order. The vocabulary is
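The two ideas above can be sketched together; a minimal illustration (the regex-based tokenizer and the example vocabulary are assumptions, not a production tokenizer):

```python
import re
from collections import Counter

def tokenize(text):
    # Naive lowercase word tokenizer; real tokenizers also handle
    # contractions, multi-word names, and languages without spaces.
    return re.findall(r"[a-z']+", text.lower())

def bag_of_words(text, vocabulary):
    # Count occurrences of each vocabulary word, ignoring word order.
    counts = Counter(tokenize(text))
    return {word: counts.get(word, 0) for word in vocabulary}

vocab = ["the", "cat", "sat", "mat"]
print(bag_of_words("The cat sat on the mat.", vocab))
# {'the': 2, 'cat': 1, 'sat': 1, 'mat': 1}
```

Note that "on" is dropped because it is outside the vocabulary, and word order is lost entirely.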
Cybersecurity Essentials: Concepts & Best Practices
Vulnerability & Patch Management
(Domain 4 Concepts)
Common Vulnerability Scoring System (CVSS)
- Rates vulnerabilities on a scale of 0-10 for severity.
Exploit vs. Zero-Day
- Exploit: A known attack method against a vulnerability.
- Zero-Day: A vulnerability that is exploited before the vendor knows about it, so no patch is available yet.
Remediation vs. Mitigation
- Remediation: Completely fixing a vulnerability.
- Mitigation: Applying temporary protections while awaiting a full fix.
Common Scanning Tools
- Nessus / OpenVAS: Identify known vulnerabilities.
- Nikto: Scans web servers for known vulnerabilities and misconfigurations.
Quantitative Methods & Machine Learning Essentials
Likelihood Function
The likelihood function describes how observed data depends on the model parameters, θ. It is often denoted as p(x|θ).
Maximum Likelihood Estimation (MLE)
The Maximum Likelihood Estimator (MLE), δ(x), is defined as:
δ(x) = argmax over θ of p(x|θ)
or equivalently:
δ(x) = argmax over θ of log p(x|θ)
For n i.i.d. Gaussian observations, the log-likelihood is:
log p(x|μ, σ²) = -(n/2) log(2πσ²) - Σᵢ (xᵢ - μ)² / (2σ²)
Asymptotic Distribution of MLE
When n is large, the asymptotic distribution of the MLE is approximately:
δ(x) ~ N(θ, 1 / (n·I(θ)))
where:
I(θ) = -E[∂² log p(x|θ) / ∂θ²] is the Fisher information.
Examples include:
- Gaussian (known σ²): I(μ) = 1/σ²
- Exponential: I(λ) = 1/λ²
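The Gaussian and exponential cases have closed-form MLEs (sample mean, and reciprocal of the sample mean, respectively), which can be checked numerically; a minimal sketch with made-up data:

```python
import math

data = [1.2, 0.7, 2.3, 1.9, 0.4]  # illustrative observations, not from the notes
n = len(data)

# Gaussian: the MLE of the mean mu is the sample mean.
mu_hat = sum(data) / n

# Exponential: the MLE of the rate lambda is 1 / sample mean.
lambda_hat = n / sum(data)

def gaussian_loglik(mu, sigma2=1.0):
    # log p(x|mu, sigma^2) for i.i.d. Gaussian observations
    return sum(-0.5 * math.log(2 * math.pi * sigma2)
               - (x - mu) ** 2 / (2 * sigma2) for x in data)

# Sanity check: the log-likelihood at mu_hat beats nearby values.
assert gaussian_loglik(mu_hat) >= gaussian_loglik(mu_hat + 0.1)
assert gaussian_loglik(mu_hat) >= gaussian_loglik(mu_hat - 0.1)
```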
Bayesian Estimation
The Bayesian forecast (posterior predictive distribution) is given by:
p(x_new|x) = ∫ p(x_new|θ) p(θ|x) dθ
Assume the
Probabilistic Reasoning & AI Decision Making
Probabilistic Inference
What Is Probability?
A measure P assigning each event A a number in [0, 1].
Axioms of Probability
P(∅) = 0, P(Ω) = 1
For disjoint A, B: P(A∪B) = P(A) + P(B)
Conditional Probability:
P(A|B) = P(A∧B) / P(B)
Bayesian Inference Fundamentals
Bayes’ Rule
P(H|E) = [P(E|H) * P(H)] / P(E)
Key Terms in Bayes’ Rule
Prior P(H): initial belief
Likelihood P(E|H): probability of the evidence given the hypothesis
Posterior P(H|E): updated belief
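Bayes' rule can be computed directly; a minimal sketch using hypothetical numbers (a diagnostic-test example, not taken from the notes):

```python
# Hypothetical inputs: a 1% base rate, a 95%-sensitive test,
# and a 5% false-positive rate.
p_h = 0.01              # prior P(H)
p_e_given_h = 0.95      # likelihood P(E|H)
p_e_given_not_h = 0.05  # P(E|~H)

# Total probability: P(E) = P(E|H)P(H) + P(E|~H)P(~H)
p_e = p_e_given_h * p_h + p_e_given_not_h * (1 - p_h)

# Bayes' rule: P(H|E) = P(E|H) * P(H) / P(E)
posterior = p_e_given_h * p_h / p_e
print(round(posterior, 3))  # 0.161
```

Even with a fairly accurate test, the low prior keeps the posterior modest, which is the point of updating beliefs with Bayes' rule rather than reading the likelihood alone.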
Bayesian Networks (BNs)
Structure: Directed acyclic graph; nodes = variables; edges = dependencies.
Essential Machine Learning Concepts Explained
Regularization Techniques in Machine Learning
Regularization is a set of techniques used to reduce overfitting by encouraging simpler models. It works by adding a penalty term to the loss function that discourages overly large or complex weights. The two most common forms are L2 (Ridge) and L1 (Lasso) regularization.
L2 Regularization (Ridge / Weight Decay)
L2 Regularization adds the squared L2 norm of the weights to the loss: L(w) = original loss + λ‖w‖². This shrinks all weights smoothly toward zero in proportion to their size, but rarely drives any of them exactly to zero.
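The shrinkage effect of the L2 penalty can be seen in ridge regression's closed-form solution w = (XᵀX + λI)⁻¹Xᵀy; a minimal sketch with synthetic data (the data, λ values, and weights here are made up for illustration):

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(50, 3))
true_w = np.array([2.0, -1.0, 0.5])
y = X @ true_w + 0.1 * rng.normal(size=50)

def ridge(X, y, lam):
    # Closed-form ridge solution: solve (X^T X + lam*I) w = X^T y
    d = X.shape[1]
    return np.linalg.solve(X.T @ X + lam * np.eye(d), X.T @ y)

w_small = ridge(X, y, lam=0.1)
w_large = ridge(X, y, lam=100.0)

# Larger lambda shrinks the weight vector toward zero.
assert np.linalg.norm(w_large) < np.linalg.norm(w_small)
```

Increasing λ trades a little bias for lower variance, which is exactly how the penalty reduces overfitting.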