Practice Questions
Q1
Which of the following is a common activation function used in hidden layers of neural networks?
Softmax
ReLU
Mean Squared Error
Cross-Entropy
Questions & Step-by-Step Solutions
Which of the following is a common activation function used in hidden layers of neural networks?
Steps
Step 1: Understand what an activation function is. It is applied to each neuron's output and introduces non-linearity, which lets the network learn patterns that a purely linear model cannot.
Step 2: Review common activation functions. Frequently used ones include Sigmoid, Tanh, and ReLU. Softmax is normally reserved for the output layer of a classifier, while Mean Squared Error and Cross-Entropy are loss functions, not activation functions.
Step 3: Identify which activation function is commonly used in hidden layers. ReLU (Rectified Linear Unit), defined as f(x) = max(0, x), is the most widely used choice.
Step 4: Know why ReLU is popular. It is cheap to compute and avoids the vanishing-gradient problem that slows training with Sigmoid and Tanh, so networks train faster. A short code sketch comparing these functions follows these steps.

Correct answer: ReLU
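As a concrete illustration, here is a minimal NumPy sketch of the three activation functions mentioned above. The function names and sample inputs are illustrative only, not taken from any particular library.

```python
import numpy as np

def relu(x):
    """Rectified Linear Unit: passes positive values through, zeroes out negatives."""
    return np.maximum(0.0, x)

def sigmoid(x):
    """Sigmoid squashes inputs into (0, 1); its gradient shrinks for large |x|."""
    return 1.0 / (1.0 + np.exp(-x))

def tanh(x):
    """Tanh squashes inputs into (-1, 1); also prone to small gradients at the extremes."""
    return np.tanh(x)

if __name__ == "__main__":
    z = np.array([-2.0, -0.5, 0.0, 1.5, 3.0])  # example pre-activation values
    print("input  :", z)
    print("ReLU   :", relu(z))
    print("sigmoid:", np.round(sigmoid(z), 4))
    print("tanh   :", np.round(tanh(z), 4))
```

Running the script shows that ReLU simply clips negative pre-activations to zero and leaves positive ones unchanged, which is why it is so cheap to compute and keeps gradients from shrinking for positive inputs.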