mathematicalmonk
  • 262
  • 9 909 297
(ML 2.5) Generalizations for trees (CART)
Other "impurity" quantities (entropy and Gini index), and generalizations of decision trees for classification and regression using the CART approach.
A playlist of these Machine Learning videos is available here:
ua-cam.com/users/my_playlists?p=D0F06AA0D2E8FFBA
Views: 34,346

Videos

(ML 2.4) Growing a classification tree (CART)
Views: 37K · 11 years ago
How to build a decision tree for classification using the CART approach. A playlist of these Machine Learning videos is available here: ua-cam.com/users/my_playlists?p=D0F06AA0D2E8FFBA
(IC 5.14) Finite-precision arithmetic coding - Decoder
Views: 10K · 12 years ago
Pseudocode for the arithmetic coding decoder, using finite-precision. A playlist of these videos is available at: ua-cam.com/play/PLE125425EC837021F.html
(IC 5.13) Finite-precision arithmetic coding - Encoder
Views: 6K · 12 years ago
Pseudocode for the arithmetic coding encoder, using finite-precision. A playlist of these videos is available at: ua-cam.com/play/PLE125425EC837021F.html
(IC 5.12) Finite-precision arithmetic coding - Setup
Views: 6K · 12 years ago
Pre-defining the quantities that will be needed in the finite-precision algorithm. A playlist of these videos is available at: ua-cam.com/play/PLE125425EC837021F.html
(IC 5.11) Finite-precision arithmetic coding - Rescaling
Views: 6K · 12 years ago
We integrate the rescaling operations into the infinite-precision encoder, as a precursor to the finite-precision encoder. A playlist of these videos is available at: ua-cam.com/play/PLE125425EC837021F.html
(IC 5.10) Generalizing arithmetic coding to non-i.i.d. models
Views: 3.8K · 12 years ago
Arithmetic coding can accommodate essentially any probabilistic model of the source, in a very natural way. A playlist of these videos is available at: ua-cam.com/play/PLE125425EC837021F.html
(IC 5.9) Computational complexity of arithmetic coding
Views: 6K · 12 years ago
Arithmetic coding is linear time in the length of the source message and the encoded message. Since the encoded message length is near optimal on average, the expected time is near optimal. A playlist of these videos is available at: ua-cam.com/play/PLE125425EC837021F.html
(IC 5.8) Near optimality of arithmetic coding
Views: 3.8K · 12 years ago
The expected encoded length of the entire message is within 2 bits of the ideal encoded length (the entropy), assuming infinite precision. A playlist of these videos is available at: ua-cam.com/play/PLE125425EC837021F.html
(IC 5.7) Decoder for arithmetic coding (infinite-precision)
Views: 7K · 12 years ago
Pseudocode for the arithmetic coding algorithm, assuming addition and multiplication can be done exactly (i.e. with infinite precision). Later we modify this to work with finite precision. A playlist of these videos is available at: ua-cam.com/play/PLE125425EC837021F.html
(IC 5.4) Why the interval needs to be completely contained
Views: 6K · 12 years ago
To ensure unique decodability, the interval [a,b) must contain the whole interval corresponding to the encoded binary sequence, not merely the number corresponding to that sequence. A playlist of these videos is available at: ua-cam.com/play/PLE125425EC837021F.html
(IC 5.6) Encoder for arithmetic coding (infinite-precision)
Views: 8K · 12 years ago
Pseudocode for the arithmetic coding algorithm, assuming addition and multiplication can be done exactly (i.e. with infinite precision). Later we modify this to work with finite precision. A playlist of these videos is available at: ua-cam.com/play/PLE125425EC837021F.html
(IC 5.5) Rescaling operations for arithmetic coding
Views: 8K · 12 years ago
Certain rescaling operations are convenient for the infinite-precision algorithm, and are critical for the finite-precision algorithm. A playlist of these videos is available at: ua-cam.com/play/PLE125425EC837021F.html
(IC 5.3) Arithmetic coding - Example #2
Views: 18K · 12 years ago
A simple example to illustrate the basic idea of arithmetic coding. A playlist of these videos is available at: ua-cam.com/play/PLE125425EC837021F.html
(IC 5.2) Arithmetic coding - Example #1
Views: 85K · 12 years ago
A simple example to illustrate the basic idea of arithmetic coding. A playlist of these videos is available at: ua-cam.com/play/PLE125425EC837021F.html
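The basic idea these examples illustrate, repeatedly narrowing a subinterval of [0,1) according to symbol probabilities, can be sketched in a few lines. This is a minimal illustrative sketch (the three-symbol alphabet and its probabilities are made-up assumptions, not taken from the videos), using exact rationals in place of the finite-precision arithmetic the later videos develop:

```python
# Minimal sketch of arithmetic coding's interval-narrowing idea.
# Exact rationals (fractions.Fraction) sidestep round-off entirely;
# the finite-precision videos replace these with integer arithmetic.
from fractions import Fraction

# Assumed toy source: three symbols with dyadic probabilities.
probs = {"a": Fraction(1, 2), "b": Fraction(1, 4), "c": Fraction(1, 4)}
symbols = list(probs)

def cum_before(s):
    """Cumulative probability of all symbols ordered before s."""
    total = Fraction(0)
    for t in symbols:
        if t == s:
            return total
        total += probs[t]

def encode(message):
    """Narrow [low, high) once per symbol; return a point in the final interval."""
    low, high = Fraction(0), Fraction(1)
    for s in message:
        width = high - low
        high = low + width * (cum_before(s) + probs[s])
        low = low + width * cum_before(s)
    return (low + high) / 2  # any point in [low, high) identifies the message

def decode(x, n):
    """Recover n symbols by locating x in successive subintervals."""
    low, high = Fraction(0), Fraction(1)
    out = []
    for _ in range(n):
        width = high - low
        for s in symbols:
            lo = low + width * cum_before(s)
            hi = lo + width * probs[s]
            if lo <= x < hi:
                out.append(s)
                low, high = lo, hi
                break
    return "".join(out)

assert decode(encode("abca"), 4) == "abca"
```

A real codec would emit bits of the final interval incrementally rather than return one exact rational; the rescaling and finite-precision videos above cover exactly that gap.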
(IC 5.1) Arithmetic coding - introduction
Views: 59K · 12 years ago
(IC 4.12) Optimality of Huffman codes (part 7) - existence
Views: 1.8K · 12 years ago
(IC 4.13) Not every optimal prefix code is Huffman
Views: 3.2K · 12 years ago
(IC 4.11) Optimality of Huffman codes (part 6) - induction
Views: 2.4K · 12 years ago
(IC 4.10) Optimality of Huffman codes (part 5) - extension lemma
Views: 2.1K · 12 years ago
(IC 4.9) Optimality of Huffman codes (part 4) - extension and contraction
Views: 2K · 12 years ago
(IC 4.8) Optimality of Huffman codes (part 3) - sibling codes
Views: 1.8K · 12 years ago
(IC 4.7) Optimality of Huffman codes (part 2) - weak siblings
Views: 2.8K · 12 years ago
(IC 4.6) Optimality of Huffman codes (part 1) - inverse ordering
Views: 6K · 12 years ago
(IC 4.5) An issue with Huffman coding
Views: 3.9K · 12 years ago
(IC 4.4) Weighted minimization with Huffman coding
Views: 4K · 12 years ago
(IC 4.3) B-ary Huffman codes
Views: 10K · 12 years ago
(IC 4.2) Huffman coding - more examples
Views: 20K · 12 years ago
(IC 4.1) Huffman coding - introduction and example
Views: 119K · 12 years ago
(IC 3.10) Relative entropy as the mismatch inefficiency
Views: 5K · 13 years ago

COMMENTS

  • @blessaddo8042 · 4 days ago

    Any recommended books, sir?

  • @rajarshibasak559 · 6 days ago

    Thanks! My basics are much clearer now!

  • @muzaffergurersalan8529 · 7 days ago

    How can we assume a,b are known?

  • @nijiasheng711 · 10 days ago

    For those who may be confused about the underlying logic flow, the prerequisites are Bayes' theorem, conditional probability, conditional independence, the d-separation algorithm, and Bayes networks.

  • @Intense011 · 18 days ago

    Pro tip: hold Shift to draw straight lines. Good vid, though.

  • @yethusithole4695 · 1 month ago

    Practical to follow, thanks.

  • @PRiKoL1ST1 · 1 month ago

    Why do we need "smaller" sigma-algebras instead of always using the power set as the sigma-algebra?

  • @cagataykaydr3015 · 2 months ago

    Man, 12 years ago you did a better job than recent articles and lessons. I swear I understood the algorithm in less than 5 minutes, after days of looking for descriptive content to understand it. I can't thank you enough; I'm trying to write a new implementation of an image format! If I become Linus Torvalds someday, I'll make sure people will know you helped me a lot, haha!

  • @isaac5990 · 2 months ago

    Why do we need the probability mass function? Wouldn't the steps be the same for encoding and decoding for an even distribution?

  • @schoenwettersl · 3 months ago

    very nice video. thank you so much!

  • @lancezhang892 · 3 months ago

    Hello, I have one question: why are mu and theta not hidden variables?

  • @sina48289 · 3 months ago

    I pressed the like button only for the sounds you created while drawing the lines

  • @HongzhiSun · 3 months ago

    The whole series of lectures is so great and I like them very much! As you mentioned in one of your lectures, there are three big theorems in information theory: the source coding theorem, the rate distortion theorem, and the channel coding theorem. It is my best wish that you could provide video lectures on the channel coding and rate distortion theorems. Thank you so much for the wonderful lectures!

  • @user-xr9br3jj9q · 4 months ago

    I have returned to this derivation several times over the past many years. This is a clear calculation of the gradient of the log-likelihood function for logistic regression. Thank you!

  • @skadoosh296 · 4 months ago

    if C_j is a vector of counts of each category, what does alpha mean?

  • @Kayi_Alp07 · 4 months ago

    Sir, your tutorials are amazing, but many students are not benefiting from them. To help them find your tutorials, please add a profile picture to your YouTube channel and thumbnails to your videos.

  • @provocateach · 5 months ago

    This is like the best algorithm ever designed. So many nice properties, from being able to derive the length of the source string from the interval while decoding, to this elegant entropy property. Utterly beautiful.

  • @Nikifuj908 · 5 months ago

    You had me at “this is overhead”. Number version is bloat, just like Microsoft telemetry

  • @alicetang8009 · 5 months ago

    Sorry, the way you pronounce theta and data really confuses me 😶‍🌫

  • @lordcasper3357 · 5 months ago

    this is so good man

  • @merterdem9345 · 5 months ago

    Why is the differential equal to the term inside the infinite sum?

  • @ebubechukwuenwerem9145 · 5 months ago

    Just one example with real values and data would have helped a lot.

  • @andrashorvath2411 · 6 months ago

    Fantastic presentation, easy to follow and giving great intuition, thank you.

  • @ThePlacehole · 6 months ago

    It's videos like these that make you wish youtube had a 3x playback speed...

  • @cypherecon5989 · 7 months ago

    The first column of the design matrix should contain only 1s, shouldn't it?

  • @Im_a_weirdo666 · 7 months ago

    What did he say

  • @mohsinjunaid8454 · 8 months ago

    Super intuitive, thanks!

  • @asmaaasmaa2395 · 8 months ago

    Why is B=2 in example a?

  • @gennadyshutkov1912 · 8 months ago

    Hello, I have watched the whole series of lectures about arithmetic encoding and I'm very thrilled about this algorithm and your way of explaining things. You are a great professor and mathematician. Thank you for your lectures! Unfortunately, my finite-precision implementation works incorrectly on large input data (maybe due to round-off error). So maybe you could share your implementation of the algorithm. It would be a great help! Thank you in advance!

    • @HongzhiSun · 3 months ago

      Set the probability distribution so that each p(x_i) has the form 1/2**m with m>=0. It seems the algorithm works for long sequences in this case. I guess this can get rid of the round-off error issue, but I haven't proved it yet. In addition, any places like "< half" or "> half" or "> quarter and < 3*quarter" should use <= and >= instead.

  • @amirreza08 · 9 months ago

    It was one of the best explanations, so informative and helpful. Thank you!

  • @pedrocolangelo5844 · 9 months ago

    Your enthusiastic way of teaching is so inspiring. Thank you for sharing this great video!

  • @Celdorsc2 · 9 months ago

    With regret I have to say, these are hardly good videos to learn this topic from. It is not the first video that leaves me completely confused about where things come from or why certain terms appear in the main or side equations. It's not well organised, IMO. I have made a few attempts to come back to the playlist and give it another go. At first I thought I was missing basics; now I am convinced the videos are the problem. The idea is great and the effort is appreciated, but sadly it's hard to notice or understand things if they are not explained.

  • @adelecutler1195 · 9 months ago

    You cite a study but you do not cite the inventor of Random Forests, Leo Breiman.

  • @damoonrobatian9371 · 9 months ago

    I like these videos but this one was very confusing!

  • @denemesistemi · 10 months ago

    Yes, this seems good; I agree on that topic.

  • @dineshds5124 · 10 months ago

    Thank you, teacher 👍

  • @philip2.042 · 10 months ago

    Good explanation of Poisson and Exponential rv

  • @leukosnanos · 10 months ago

    I strongly disagree with the naming you discuss at the beginning. Logistic regression ultimately serves as a classification method, but it fits a logistic (or sigmoid) curve to the data, so it could be thought of as a regression process.

  • @stokesarisnet · 11 months ago

    Didn't quite follow the step where "the joint was summed over c". What does that mean?

  • @rebeccacarroll8385 · 11 months ago

    A little disappointed that Arianna Rosenbluth wasn't included in the list of creators; she was also the first to use the method. Great explanation of MCMC!

  • @QT-yt4db · 11 months ago

    I like this name "mathematicalmonk" a lot. Because I think you must have a monk's heart to love mathematics...

  • @BillHaug · 11 months ago

    thank you

  • @bhardwajsatyam · 11 months ago

    Speaking from personal experience, you explain these concepts so much better than some professor with an h-index of 60 at a well-known university.

  • @michellemaps8103 · 11 months ago

    Sir, could you please make a video on the expected values of gamma random variables?

  • @pan19682 · 1 year ago

    It was a perfect exposition of the subject, many thanks.

  • @JReuben111 · 1 year ago

    This vid makes references to watching other vids before proceeding to the next one !

  • @JReuben111 · 1 year ago

    I must admit, I got lost in the details here ... 😢

  • @MachineCarlo · 1 year ago

    For the continuous case, of what use is it?

  • @ikhlasulkamal5245 · 1 year ago

    Can you please come back to YouTube, sir? Your videos are great. Sadly, I'm 10 years late to see them.