<h2>Problem Set 8</h2>
<p><em>Posted December 11, 2017 &middot; <a href="http://www.nullplug.org/ML-Blog/2017/12/11/problem-set-8/">Permalink</a></em></p>
<p>This is to be completed by December 14th, 2017. There will be no exercise session this week.</p>
<h3>Exercises</h3>
<ol>
<li><a href="https://www.datacamp.com/home">Datacamp</a>
<ul>
<li>Complete the lesson:<br />
a. Beginning Bayes in R</li>
</ul>
</li>
</ol>
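<p>As a warm-up for the ideas covered in the lesson, here is a minimal sketch of conjugate Beta-Binomial updating — written in Python purely for illustration (the lesson itself is in R, and the function names below are our own, not from the lesson): observing <code>k</code> successes in <code>n</code> trials updates a Beta(a, b) prior to a Beta(a + k, b + n &minus; k) posterior.</p>

```python
# Illustrative Beta-Binomial updating sketch (not taken from the Datacamp lesson).
# A Beta(a, b) prior plus k successes in n Bernoulli trials gives a
# Beta(a + k, b + n - k) posterior by conjugacy.

def update_beta(a, b, successes, trials):
    """Return the posterior Beta parameters after observing the data."""
    return a + successes, b + (trials - successes)

def beta_mean(a, b):
    """Mean of a Beta(a, b) distribution."""
    return a / (a + b)

# Start from a uniform Beta(1, 1) prior and observe 7 heads in 10 coin flips.
a_post, b_post = update_beta(1, 1, 7, 10)
print(a_post, b_post)                        # 8 4
print(round(beta_mean(a_post, b_post), 3))   # 0.667
```

<p>The posterior mean 8/12 &asymp; 0.667 sits between the prior mean (0.5) and the observed frequency (0.7), which is the qualitative behavior the lesson explores.</p>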