
Machine Learning and Inductive Inference

Course Information

Prof:        Thiery Wim
Courses:     Lectures and exercises
Examination: Oral and written exam, reports
Background:
Credits:     3
When?        1st semester
ECTS:        VUB

Since 2021-2022, this course has been available to a broader range of study programs.

Exams from before 2021-2022 can be found on the Wina exam wiki.

2022

January

26/01/22

Small questions:

  1. Connect 6 concepts with the right sentence about the concept or its definition.
  2. Same as the previous question.
  3. Three questions where you have to answer what the concept is.
  4. Which of the three has the largest entropy: {a,a,a,a,a,a}, {a,b}, or {a,a,a,a,a,b}? (A worked check follows this list.)
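
For the entropy question, a quick check, assuming entropy is measured in bits over the label distribution of each sample:

  from math import log2

  def entropy(labels):
      """Shannon entropy (in bits) of the label distribution in a sample."""
      n = len(labels)
      return -sum((labels.count(v) / n) * log2(labels.count(v) / n)
                  for v in set(labels))

  print(entropy(list("aaaaaa")))  # 0.0   (pure sample)
  print(entropy(list("ab")))      # 1.0   (50/50 split, the maximum for 2 classes)
  print(entropy(list("aaaaab")))  # ~0.65 (-5/6*log2(5/6) - 1/6*log2(1/6))

So {a,b} has the largest entropy.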

Big questions:

  1. Twelve examples are given in which 4 attributes (2 or 3 possible values per attribute) predict 1 class (3 possible values). The 4 attribute values of a new instance are also given. (A sketch of both classifiers follows this list.)
    • Classify the new instance with naive Bayes (m-estimate, m=2, q=1/2).
    • Classify the new instance with 4-NN (calculate the distances to the examples).
  2. The hypothesis "Tony likes pizza" is formulated as a conjunction of at most 2 literals (0, 1, or 2). The possible attributes that predict the hypothesis are {mushrooms, ham, tomatoes, onion}. A valid example is "Tony likes pizza with mushrooms and tomatoes", or with 0 literals "Tony likes all pizza". Invalid is "Tony likes pizza with mushrooms or ham" (no disjunctions). (A brute-force check follows this list.)
    • Show that the VC dimension of this hypothesis space is at most 3.
    • Show that the VC dimension of this hypothesis space is at least 3.
  3. Given 2 clauses (something like p(X,X) <-- q(X,Y), ... and p(X,Y) <-- ..., both with a total of 3 literals). (An LGG sketch follows this list.)
    • How many variables are there after combining them to calculate the LGG?
    • What is the LGG? (multiple choice, a-g)
  4. Calculate the ROC based on the decision tree below. The proportions in the leaves describe how the training data is labeled. Assume that a new instance is classified as positive if the proportion of positives in its leaf is higher than a threshold C. Let the threshold C vary from 0 to 1 and draw the ROC curve for this decision tree classifier. (The tree itself is not reproduced in this recollection; a generic sketch follows this list.)
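
For big question 1, a minimal sketch of both classifiers. The actual 12 examples are not reproduced in this recollection, so the attribute values and class labels below are made up for illustration; the naive Bayes conditionals use the m-estimate (n(a,c) + m*q) / (n(c) + m) with m=2 and q=1/2, and 4-NN uses Hamming distance on the categorical attributes.

  from collections import Counter

  # Hypothetical training data: 4 categorical attributes -> class (illustration only).
  examples = [
      (("x", "u", "p", "s"), "A"), (("x", "v", "p", "t"), "A"),
      (("y", "u", "q", "s"), "B"), (("y", "v", "q", "t"), "B"),
      (("z", "u", "p", "s"), "C"), (("x", "u", "q", "t"), "C"),
  ]
  new = ("x", "u", "p", "t")
  m, q = 2, 0.5  # m-estimate parameters from the exam question

  class_counts = Counter(c for _, c in examples)

  def nb_score(c):
      # P(c) * product over attributes of the m-estimate of P(a_i | c).
      score = class_counts[c] / len(examples)
      for i, a in enumerate(new):
          n_ac = sum(1 for attrs, cl in examples if cl == c and attrs[i] == a)
          score *= (n_ac + m * q) / (class_counts[c] + m)
      return score

  print("naive Bayes:", max(class_counts, key=nb_score))

  # 4-NN with Hamming distance: majority class among the 4 closest examples.
  def hamming(x, y):
      return sum(a != b for a, b in zip(x, y))

  nearest = sorted(examples, key=lambda e: hamming(e[0], new))[:4]
  print("4-NN:", Counter(c for _, c in nearest).most_common(1)[0][0])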
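
For big question 2, the "at most" direction already follows from counting: there are only 1 + 4 + 6 = 11 conjunctions of at most 2 of the 4 attributes, while shattering 4 instances would require 2^4 = 16 distinct labelings. The brute-force check below confirms both directions; it assumes a pizza is modeled as the set of its toppings and a hypothesis covers a pizza iff all of its required toppings are present.

  from itertools import combinations

  toppings = ["mushrooms", "ham", "tomatoes", "onion"]
  pizzas = [frozenset(s) for r in range(5) for s in combinations(toppings, r)]
  # Hypotheses: "Tony likes pizza with all of these", for 0, 1, or 2 toppings.
  hypotheses = [frozenset(s) for r in range(3) for s in combinations(toppings, r)]

  def shattered(instances):
      # A set is shattered if the hypotheses realize all 2^|instances| labelings.
      labelings = {tuple(h <= x for x in instances) for h in hypotheses}
      return len(labelings) == 2 ** len(instances)

  print(any(shattered(s) for s in combinations(pizzas, 3)))  # True  -> VC >= 3
  print(any(shattered(s) for s in combinations(pizzas, 4)))  # False -> VC <= 3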
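
For big question 3, the exact clauses are elided above, so the atoms below are hypothetical. Plotkin's LGG keeps terms that are equal and replaces each distinct pair of differing terms by one fresh variable, with the pair-to-variable mapping shared across the whole clause; counting the entries of that mapping answers the "how many variables" part.

  def lgg_term(s, t, mapping):
      # Equal terms are kept; each distinct pair of differing terms
      # gets exactly one fresh variable.
      if s == t:
          return s
      if (s, t) not in mapping:
          mapping[(s, t)] = f"V{len(mapping) + 1}"
      return mapping[(s, t)]

  def lgg_atom(a1, a2, mapping):
      (p1, args1), (p2, args2) = a1, a2
      assert p1 == p2 and len(args1) == len(args2)
      return (p1, tuple(lgg_term(s, t, mapping) for s, t in zip(args1, args2)))

  mapping = {}  # shared, so reused term pairs reuse the same variable
  print(lgg_atom(("p", ("X", "X")), ("p", ("X", "Y")), mapping))  # ('p', ('X', 'V1'))
  print("fresh variables:", len(mapping))                         # 1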
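
For big question 4, the tree is missing, but the construction is generic: each leaf scores its instances with its training proportion of positives, so as the threshold C drops from 1 to 0 the leaves turn positive in descending order of that proportion, and the ROC curve is the piecewise-linear path through the cumulative (FPR, TPR) points. A sketch with made-up leaf counts:

  # Hypothetical leaf contents as (positives, negatives) in the training data.
  leaves = [(8, 2), (4, 4), (1, 5)]
  P = sum(p for p, _ in leaves)
  N = sum(n for _, n in leaves)

  # Sort by proportion of positives, descending: the order in which leaves
  # flip to "positive" as the threshold C decreases from 1 to 0.
  leaves.sort(key=lambda leaf: leaf[0] / (leaf[0] + leaf[1]), reverse=True)

  points, tp, fp = [(0.0, 0.0)], 0, 0
  for p, n in leaves:
      tp, fp = tp + p, fp + n
      points.append((fp / N, tp / P))  # (FPR, TPR) once this leaf is positive

  print(points)  # vertices of the ROC curve, from (0, 0) to (1, 1)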