User contributions
- 04:49, 15 March 2021 diff hist +725 Uniquely Decodable Codes →Proof for Prefix-Free Codes
- 04:40, 15 March 2021 diff hist +555 Uniquely Decodable Codes →Proof for Prefix-Free Codes
- 04:33, 15 March 2021 diff hist +3,725 N Uniquely Decodable Codes Created page with "== Special Case: Prefix-Free Codes == == The Kraft-McMillan Inequality == The Kraft-McMillan inequality provides (1) a necessary condition for unique decodability, and (2) a..."
- 03:58, 15 March 2021 diff hist +332 Huffman codes →Upper bound on Expected Length
- 03:54, 15 March 2021 diff hist +263 Huffman codes →Upper bound on Expected Length
- 03:50, 15 March 2021 diff hist +55 Huffman codes →Upper bound on Expected Length
- 03:43, 15 March 2021 diff hist +1,224 Huffman codes
- 03:23, 15 March 2021 diff hist +1,220 Huffman codes →Optimality
- 03:10, 15 March 2021 diff hist +11 Huffman codes →Achieving Shannon's limit
- 03:09, 15 March 2021 diff hist +104 Huffman codes →Achieving Shannon's limit
- 03:06, 15 March 2021 diff hist +2,054 N Huffman codes Created page with "== Construction == File:Huffman-example.svg == Optimality == The optimality of the Huffman code hinges on the following two properties of optimal prefix-free codes: * *..."
- 02:35, 15 March 2021 diff hist +350 Source coding →Source codes
- 02:29, 15 March 2021 diff hist +160 Source coding →Source codes
- 02:24, 15 March 2021 diff hist +961 Source coding
- 02:11, 15 March 2021 diff hist -3 Source coding →Source codes
- 02:08, 15 March 2021 diff hist -9 m Source coding →Source codes
- 01:57, 15 March 2021 diff hist +275 Source coding →Source codes
- 01:48, 15 March 2021 diff hist +3,793 N Source coding Created page with " == A First Look at Shannon's Communication Theory == thumb|500px|Figure 1: A general communication system<ref name="shannon1948"/>. In his..."
- 01:05, 15 March 2021 diff hist 0 N File:Huffman-example.svg current
- 00:16, 15 March 2021 diff hist 0 N File:Prefix-tree-examples.svg current
- 23:51, 14 March 2021 diff hist 0 N File:Code-hierarchy.svg current
- 23:41, 14 March 2021 diff hist 0 N File:Prefix-code-tree.svg current
- 18:52, 10 March 2021 diff hist +165 161-A1.2 →Exercise 1.2: Conditional entropy (1 point) current
- 21:58, 6 March 2021 diff hist +2 161-A1.2 →Exercise 1.1: Estimating mutual information (9 points)
- 21:40, 6 March 2021 diff hist +487 161-A1.2 →Exercise 1.1: Estimating mutual information (9 points)
- 21:19, 6 March 2021 diff hist -4 CoE 161 S2 AY 2020-2021 →Syllabus
- 21:13, 6 March 2021 diff hist +203 161-A1.2 →Exercise 1.2: Conditional entropy (1 point)
- 21:10, 6 March 2021 diff hist -68 161-A1.2 →Exercise 1: Estimating mutual information
- 13:23, 6 March 2021 diff hist +181 161-A1.2 →Entropy and conditional entropy
- 13:17, 6 March 2021 diff hist +588 Conditional Entropy and Mutual Information →Definition current
- 12:26, 6 March 2021 diff hist +7,274 N Conditional Entropy and Mutual Information Created page with "When we work with multiple information sources (random variables) <math>X</math> and <math>Y</math>, the following details are useful: * How much information overlap is there..."
- 12:23, 6 March 2021 diff hist 0 N File:Venn diagram joint entropy.svg current
- 12:22, 6 March 2021 diff hist 0 N File:Venn diagram.svg current
- 12:13, 6 March 2021 diff hist -15 CoE 161 S2 AY 2020-2021 →Syllabus
- 17:54, 4 March 2021 diff hist +1 Entropy →Definition current
- 17:53, 4 March 2021 diff hist -3 Entropy →The role of proofs in CoE 161
- 17:50, 4 March 2021 diff hist +298 161-A1.2 →Entropy and conditional entropy
- 17:48, 4 March 2021 diff hist -4 161-A1.2 →A unified view of random variables
- 17:47, 4 March 2021 diff hist +2 161-A1.2 →A unified view of random variables
- 17:42, 4 March 2021 diff hist +1 m 161-A1.2 →Entropy and conditional entropy
- 17:39, 4 March 2021 diff hist 0 161-A1.2 →Entropy and conditional entropy
- 23:55, 2 March 2021 diff hist +267 Entropy →Definition
- 23:40, 2 March 2021 diff hist +2 Entropy →Proving the statement for a family of instances
- 23:31, 2 March 2021 diff hist 0 Entropy →Some properties
- 23:30, 2 March 2021 diff hist -12 CoE 161 S2 AY 2020-2021 →Syllabus
- 23:27, 2 March 2021 diff hist +6,838 N Entropy Created page with "== Definition == In information theory, we always work with random variables. In his landmark paper, Shannon introduced a function <math>H</math> that takes in a random variab..."
- 13:33, 2 March 2021 diff hist +286 CoE 161 S2 AY 2020-2021 →Syllabus
- 13:26, 2 March 2021 diff hist -31 CoE 161 S2 AY 2020-2021 →Syllabus
- 13:19, 2 March 2021 diff hist +4 CoE 161 S2 AY 2020-2021 →Syllabus
- 13:16, 2 March 2021 diff hist +12 161-A1.2 →Entropy and conditional entropy