Machine Learning Bytes
Author: Erik Partridge
Short, simple summaries of machine learning topics, to help you prepare for exams, interviews, reading the latest papers, or just a quick brush-up. In less than two minutes, we'll cover the most obscure jargon and complex topics in machine learning. For more details, including small animated presentations, please visit erikpartridge.com. Please do join the conversation on Twitter for corrections, conversations, support, and more at #mlbytes.
Language: en | Genres: Technology
K-Fold Cross Validation
Episode 6
Wednesday, 31 July, 2019
K-fold cross validation is the practice by which we split a data set into k equally sized folds, then train our model k times: each time on k−1 of the folds, validating on the one held out, so that every fold serves as the validation set exactly once. Averaging the k validation scores gives a more reliable estimate of how the model will generalize than a single train/test split does. This is generally considered a best practice, or at least good practice, in machine learning. Machine Learning Mastery has a great post on the topic.
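The procedure described above can be sketched in a few lines. This is a minimal illustration using scikit-learn; the episode doesn't name a library or model, so the logistic-regression classifier and synthetic data here are assumptions for demonstration only.

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import KFold

# Toy data set for illustration: 100 samples, 5 features.
X, y = make_classification(n_samples=100, n_features=5, random_state=0)

# Split the data into k = 5 folds.
kf = KFold(n_splits=5, shuffle=True, random_state=0)

scores = []
for train_idx, val_idx in kf.split(X):
    # Train on k-1 of the folds...
    model = LogisticRegression().fit(X[train_idx], y[train_idx])
    # ...and validate on the single held-out fold.
    scores.append(model.score(X[val_idx], y[val_idx]))

# Each fold served as the validation set exactly once; the mean
# accuracy across folds estimates generalization performance.
print(f"mean accuracy over {len(scores)} folds: {np.mean(scores):.3f}")
```

Each sample lands in the validation set exactly once, which is what makes the averaged score a sturdier estimate than one fixed split.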










