- Towards an Algorithm Selection Standard: Data Format and Tools. Lars Kotthoff. 1 [doi]
- Bayesian Optimization for More Automatic Machine Learning. Frank Hutter. 2 [doi]
- Using Meta-Learning to Initialize Bayesian Optimization of Hyperparameters. Matthias Feurer, Jost Tobias Springenberg, Frank Hutter. 3-10 [doi]
- Similarity Measures of Algorithm Performance for Cost-Sensitive Scenarios. Carlos Eduardo Castor de Melo, Ricardo B. C. Prudêncio. 11-17 [doi]
- Using Metalearning to Predict When Parameter Optimization Is Likely to Improve Classification Accuracy. Parker Ridd, Christophe G. Giraud-Carrier. 18-23 [doi]
- Surrogate Benchmarks for Hyperparameter Optimization. Katharina Eggensperger, Frank Hutter, Holger H. Hoos, Kevin Leyton-Brown. 24-31 [doi]
- A Framework To Decompose And Develop Metafeatures. Fábio Pinto, Carlos Soares, João Mendes-Moreira. 32-36 [doi]
- Towards Meta-learning over Data Streams. Jan N. van Rijn, Geoffrey Holmes, Bernhard Pfahringer, Joaquin Vanschoren. 37-38 [doi]
- Recommending Learning Algorithms and Their Associated Hyperparameters. Michael R. Smith, Logan Mitchell, Christophe G. Giraud-Carrier, Tony R. Martinez. 39-40 [doi]
- An Easy to Use Repository for Comparing and Improving Machine Learning Algorithm Usage. Michael R. Smith, Andrew White, Christophe G. Giraud-Carrier, Tony R. Martinez. 41-48 [doi]
- Measures for Combining Accuracy and Time for Meta-learning. Salisu Abdulrahman, Pavel Brazdil. 49-50 [doi]
- Determining a Proper Initial Configuration of Red-Black Planning by Machine Learning. Otakar Trunda, Roman Barták. 51-52 [doi]
- Hybrid Multi-Agent System for Metalearning in Data Mining. Klára Pesková, Jakub Smíd, Martin Pilát, Ondrej Kazík, Roman Neruda. 53-54 [doi]
- Model Selection in Data Analysis Competitions. David Kofoed Wind, Ole Winther. 55-60 [doi]