Company: Infoedge_IITM
Difficulty: medium
Q1. Why are activation functions used in neural networks?
a) For faster convergence
b) To introduce non-linearity
c) Both a) and b)
d) None of the above

Q2. In which neural network architecture does weight sharing occur?
a) Convolutional Neural Network
b) Artificial Neural Network
c) Deep Belief Network
d) Both a) and b)

Q3. Dropout is used to solve which of the following issues?
a) Under-fitting
b) Over-fitting
c) Both a) and b)
d) None of the above

Q4. Which of the following networks uses skip connections, with all layers connected to each other via identity mappings?
a) ResNet
b) AlexNet
c) Boltzmann Machine
d) DenseNet

Q5. Which of the following techniques performs an operation similar to dropout in a neural network?
a) Bagging
b) Boosting
c) Stacking
d) None of the above

Q6. Skip-gram and continuous bag-of-words (CBOW) are two architectures of the
a) GloVe model
b) word2vec model
c) latent semantic indexing
d) None of the above

Q7. Which of the following is TRUE for the CBOW and Skip-gram (SG) word2vec models?
CBOW uses the context to predict the word in the middle, while SG uses the input word to predict the context.
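The dropout question above can be illustrated with a minimal sketch of inverted dropout (function name, the drop probability `p`, and the seeding are illustrative choices, not from the source): during training, each activation is zeroed with probability `p` and the survivors are scaled by `1/(1-p)` so the expected activation is unchanged at test time, which is how dropout combats over-fitting.

```python
import random

def dropout(values, p=0.5, train=True, seed=0):
    """Inverted dropout sketch: zero each activation with probability p
    during training and scale survivors by 1/(1-p); at test time (train=False)
    the activations pass through unchanged."""
    if not train or p == 0.0:
        return list(values)
    rng = random.Random(seed)  # seeded for reproducibility in this sketch
    keep = 1.0 - p
    return [v / keep if rng.random() < keep else 0.0 for v in values]
```

At inference the layer is effectively an identity, which is why no rescaling is needed at test time with the inverted formulation.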
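Weight sharing, asked about above, can be shown with a tiny 1-D convolution sketch (the function name and shapes are illustrative): the same kernel weights are applied at every position of the input, which is exactly the parameter sharing that distinguishes convolutional layers from fully connected ones.

```python
def conv1d(signal, kernel):
    """Valid 1-D cross-correlation (the operation CNNs call "convolution"):
    the SAME kernel weights slide over every input position -> weight sharing."""
    k = len(kernel)
    return [sum(signal[i + j] * kernel[j] for j in range(k))
            for i in range(len(signal) - k + 1)]
```

A fully connected layer mapping the same input to the same output length would need a separate weight per (input, output) pair; here the kernel's few weights are reused everywhere.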
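The CBOW vs. Skip-gram distinction in the last question comes down to how training pairs are built from a token stream. A minimal sketch (function name, `window` parameter, and pair layout are illustrative assumptions): Skip-gram emits (center word, context word) pairs, while CBOW emits (context words, center word) pairs.

```python
def training_pairs(tokens, window=2, mode="skipgram"):
    """Build word2vec-style training pairs from a token list.
    Skip-gram: one (center, context_word) pair per context word.
    CBOW: one (context_word_list, center) pair per position."""
    pairs = []
    for i, center in enumerate(tokens):
        context = [tokens[j]
                   for j in range(max(0, i - window),
                                  min(len(tokens), i + window + 1))
                   if j != i]
        if mode == "skipgram":
            pairs.extend((center, c) for c in context)
        else:  # CBOW: the whole context predicts the center word
            pairs.append((context, center))
    return pairs
```

Because Skip-gram produces one pair per (center, context word) combination, it generates more training examples per sentence than CBOW, which averages the context into a single input.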