All Posts
-
Boltzmann Machine with Energy-Based Models and Restricted Boltzmann Machines (RBM) · MLAI/DeepLearning · 2019. 10. 19. 19:36
1. Overview A Boltzmann machine (also called stochastic Hopfield network with hidden units) is a type of stochastic recurrent neural network and Markov random field. Boltzmann machines can be seen as the stochastic, generative counterpart of Hopfield networks. They were one of the first neural networks capable of learning internal representations, and are able to represent and (given sufficient ..
-
Asterisk (*) in Python · DynamicPL/Python · 2019. 10. 18. 16:15
1. Overview There are four cases for using the asterisk in Python. For multiplication and power operations. For repeatedly extending list-type containers. For variadic arguments (so-called “packing”). For unpacking containers. Let’s look at each case. 2. Description 2.1 For multiplication and power operations >>> 2 * 3 6 >>> 2 ** 3 8 >>> 1.414 * 1.414 1.9993959999999997 >>> 1.414 ..
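The four asterisk uses listed in this excerpt can be sketched in a few lines (a minimal sketch; the names `total`, `zeros`, and `values` are illustrative, not from the post):

```python
# 1. Multiplication and power operations
product = 2 * 3        # 6
power = 2 ** 3         # 8

# 2. Repeatedly extending a list-type container
zeros = [0] * 4        # [0, 0, 0, 0]

# 3. Variadic arguments ("packing"): extra positional args land in a tuple
def total(*numbers):
    return sum(numbers)

# 4. Unpacking a container into positional arguments
values = [1, 2, 3]
result = total(*values)   # equivalent to total(1, 2, 3)
```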
-
Data Structures in Python · DynamicPL/Python · 2019. 10. 18. 14:54
1. Overview There are quite a few data structures available. The built-in data structures are lists, tuples, dictionaries, strings, sets, and frozensets. Lists, strings, and tuples are ordered sequences of objects. Unlike strings, which contain only characters, lists and tuples can contain objects of any type. Lists and tuples are like arrays. Tuples, like strings, are immutable. Lists are mutable ..
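The mutability distinctions this excerpt draws can be demonstrated directly (a small sketch; variable names are illustrative):

```python
# Lists are mutable: in-place assignment works
nums = [1, 2, 3]
nums[0] = 99

# Tuples, like strings, are immutable: item assignment raises TypeError
pair = (1, 2)
try:
    pair[0] = 99
except TypeError:
    pair_is_immutable = True

# Strings are immutable too: methods return a new string
text = "abc"
upper = text.upper()   # text itself is unchanged

# Sets are mutable and unordered; frozensets are immutable,
# so a frozenset can even serve as a dictionary key
unique = {1, 2, 2, 3}
frozen = frozenset(unique)
lookup = {frozen: "allowed"}
```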
-
Classify Deep Learning · MLAI/DeepLearning · 2019. 10. 16. 22:01
1. Overview Supervised learning is the machine learning task of learning a function that maps an input to an output based on example input-output pairs. It infers a function from labeled training data consisting of a set of training examples. In supervised learning, each example is a pair consisting of an input object (typically a vector) and the desired output value (also called the supervisory..
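The idea of inferring a function from labeled (input, output) pairs can be made concrete with a toy learner (illustrative only; a 1-nearest-neighbour rule stands in for the inferred function, and the data are made up):

```python
# Labeled training data: each example is an (input vector, desired output) pair
training_pairs = [
    ((1.0, 1.0), "cat"),
    ((1.2, 0.9), "cat"),
    ((8.0, 9.0), "dog"),
    ((9.1, 8.5), "dog"),
]

def predict(x):
    """Map an unseen input to the label of its nearest training example."""
    def sq_dist(pair):
        return sum((a - b) ** 2 for a, b in zip(x, pair[0]))
    return min(training_pairs, key=sq_dist)[1]

label = predict((8.5, 9.2))   # nearest neighbours are the "dog" examples
```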
-
Softmax and Cross-Entropy with CNN · MLAI/DeepLearning · 2019. 10. 16. 15:56
1. Overview 2. Description How come the two output values add up to one? 2.1 Softmax function (normalized exponential function) $$f_{j}(z)=\frac{e^{z_{j}}}{\sum_{k}e^{z_{k}}}$$ Normally, the dog and cat neurons would output arbitrary real values. Applying the softmax function written at the top brings these values to between zero and one and makes them a..
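The softmax formula above translates directly into code. A minimal sketch (subtracting the maximum before exponentiating is a standard numerical-stability trick, not part of the formula itself):

```python
import math

def softmax(z):
    """f_j(z) = e^(z_j) / sum_k e^(z_k), computed stably."""
    m = max(z)
    exps = [math.exp(v - m) for v in z]
    total = sum(exps)
    return [e / total for e in exps]

# Arbitrary real-valued "dog" and "cat" scores become values
# between zero and one that add up to one
probs = softmax([2.0, 1.0])
```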
-
Bayes' Rule (Bayes' Theorem, Bayes' Law) · Math/Probability · 2019. 10. 14. 18:55
1. Overview In probability theory and statistics, Bayes’ theorem (alternatively Bayes’ law or Bayes’ rule) describes the probability of an event, based on prior knowledge of conditions that might be related to the event. For example, if cancer is related to age, then, using Bayes’ theorem, a person's age can be used to more accurately assess the probability that they have cancer than can be done..
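The cancer-and-age illustration in this excerpt amounts to a one-line application of Bayes' theorem, P(A|B) = P(B|A)·P(A) / P(B). A minimal sketch with made-up numbers (the probabilities below are illustrative, not from the post):

```python
# A = "has cancer", B = "is over 65", in a hypothetical population
p_a = 0.01            # prior: P(cancer)
p_b_given_a = 0.50    # P(over 65 | cancer)
p_b = 0.20            # P(over 65)

# Bayes' theorem: P(A|B) = P(B|A) * P(A) / P(B)
p_a_given_b = p_b_given_a * p_a / p_b
# Conditioning on age raises the assessed probability from 1% to 2.5%
```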
-
Dependent and Independent Events, and Conditional Probability · Math/Probability · 2019. 10. 14. 15:14
1. Overview 2. Description 2.1 Independent event The theoretical probability remains unaffected by other events. 2.2 Dependent event (conditional probability) $$P(A\: |\: B)$$ The probability of getting A given that B has occurred (read “A given B”). 3. Example 4. References https://365datascience.co
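The definitions above can be checked by enumeration on a small sample space. A sketch using two fair dice (an illustrative setup, not from the post), where P(A|B) = P(A and B) / P(B):

```python
from itertools import product

# 36 equally likely outcomes for two fair dice
outcomes = list(product(range(1, 7), repeat=2))

def prob(event):
    """Probability of an event by counting favourable outcomes."""
    return sum(1 for o in outcomes if event(o)) / len(outcomes)

A = lambda o: o[0] == 6              # first die shows 6
B = lambda o: o[0] + o[1] >= 11      # total is at least 11

# Conditional probability: P(A | B) = P(A and B) / P(B)
p_a_given_b = prob(lambda o: A(o) and B(o)) / prob(B)

# B depends on the first die, so P(A|B) != P(A): A and B are dependent events
dependent = p_a_given_b != prob(A)
```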