Title: Automated Architecture-Modeling for Convolutional Neural Networks
Author: Duong, Manh Khoi
Editors: Meyer, Holger; Ritter, Norbert; Thor, Andreas; Nicklas, Daniela; Heuer, Andreas; Klettke, Meike
Date issued: 2019
Date available: 2019-04-15
ISBN: 978-3-88579-684-8
ISSN: 1617-5468
DOI: 10.18420/btw2019-ws-17
URI: https://dl.gi.de/handle/20.500.12116/21803
Language: en
Keywords: CNN; Model Architecture; Breast Cancer; Histology

Abstract: Tuning hyperparameters can be counterintuitive and misleading, yet it plays a major (often the biggest) part in many machine learning algorithms. For instance, the architecture of an artificial neural network (ANN) can itself be treated as a set of hyperparameters, e.g. the number of convolutional layers or the number of fully connected layers. These can be tuned manually or with techniques such as grid search or random search, but even then finding optimal hyperparameters remains practically infeasible. This paper addresses the problem with Bayesian optimization, which finds well-performing hyperparameters, including a suitable architecture for ANNs. In our case, a histological image dataset was used to classify breast cancer into stages.
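
Below is a minimal sketch, not taken from the paper, of how Bayesian optimization over CNN hyperparameters of the kind mentioned in the abstract might look. It assumes the scikit-optimize library; the search space (n_conv_layers, n_dense_layers, filters, learning_rate) and the objective function are hypothetical, with a synthetic stand-in for training a CNN and returning its validation error.

```python
from skopt import gp_minimize
from skopt.space import Integer, Real
from skopt.utils import use_named_args

# Hypothetical search space: architecture choices treated as hyperparameters.
space = [
    Integer(1, 5, name="n_conv_layers"),
    Integer(1, 3, name="n_dense_layers"),
    Integer(16, 128, name="filters"),
    Real(1e-5, 1e-2, prior="log-uniform", name="learning_rate"),
]

@use_named_args(space)
def objective(n_conv_layers, n_dense_layers, filters, learning_rate):
    # Synthetic stand-in: in practice one would build a CNN with these
    # settings, train it on the histology images, and return the
    # validation error to be minimized.
    return (
        (n_conv_layers - 3) ** 2
        + (n_dense_layers - 2) ** 2
        + ((filters - 64) / 64) ** 2
        + abs(learning_rate - 1e-3) * 100
    )

# Gaussian-process-based Bayesian optimization over the search space.
result = gp_minimize(objective, space, n_calls=30, random_state=0)
print("Best configuration:", result.x, "objective value:", result.fun)
```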