[fishexpert's personal notes] A collection of study materials

Deep Learning
Machine Learning

#1

### 2017.10.30 Source code reading: GitHub link: https://github.com/johndpope/hcn


#2

## 2017.11.01 The tf.convert_to_tensor() function

```python
import tensorflow as tf
import numpy as np  # we'll use NumPy matrices in TensorFlow

# Define a 2x2 matrix in 3 different ways
m1 = [[1.0, 2.0],
      [3.0, 4.0]]
m2 = np.array([[1.0, 2.0],
               [3.0, 4.0]], dtype=np.float32)
m3 = tf.constant([[1.0, 2.0],
                  [3.0, 4.0]])

# Print the type for each matrix
print(type(m1))  # <class 'list'>
print(type(m2))  # <class 'numpy.ndarray'>
print(type(m3))  # a tf.Tensor

# Create tensor objects out of the different types
t1 = tf.convert_to_tensor(m1, dtype=tf.float32)
t2 = tf.convert_to_tensor(m2, dtype=tf.float32)
t3 = tf.convert_to_tensor(m3, dtype=tf.float32)

# Notice that the types are the same now
print(type(t1))
print(type(t2))
print(type(t3))
```


#3

#### Questions:

1. What is the difference between tf.Session() and tf.InteractiveSession()?

2. tf.Session(), tf.Session.as_default(), tf.get_default_session()?

3. What is the difference between tf.placeholder(), tf.constant(), and tf.Variable()?

4. tf.Graph(), tf.Graph.as_default(), tf.Graph.device(), tf.Graph.control_dependencies(), tf.Graph.finalized?

5. What is the difference between tf.name_scope() and tf.variable_scope()?

6. What is the difference between tf.variable_scope() and tf.get_variable()?

7. tf.global_variables_initializer()?




#8

### Questions:

1. How to understand convolution and pooling in convolutional neural networks (CNNs)? Reference: https://www.zhihu.com/question/49376084

2. How to explain convolution in plain, accessible terms? Reference: https://www.zhihu.com/question/22298352

3. An Intuitive Explanation of Convolutional Neural Networks. Reference:

4. Comparison of activation functions. Reference: https://zhuanlan.zhihu.com/p/26122560
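The two operations asked about in questions 1 and 2 can be sketched in a few lines of NumPy: a "valid" 2D convolution (really cross-correlation, as in most CNN libraries) and non-overlapping max pooling. The input and kernel values below are arbitrary illustration data.

```python
import numpy as np

def conv2d_valid(x, k):
    """Valid cross-correlation (what CNN libraries call 'convolution')."""
    h, w = x.shape
    kh, kw = k.shape
    out = np.empty((h - kh + 1, w - kw + 1))
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            # Slide the kernel over the input and sum the elementwise products
            out[i, j] = np.sum(x[i:i + kh, j:j + kw] * k)
    return out

def max_pool(x, size=2):
    """Non-overlapping max pooling with a size x size window."""
    h, w = x.shape
    return x[:h - h % size, :w - w % size] \
        .reshape(h // size, size, w // size, size).max(axis=(1, 3))

x = np.arange(16, dtype=float).reshape(4, 4)
k = np.array([[1.0, 0.0],
              [0.0, 1.0]])
feat = conv2d_valid(x, k)   # shape (3, 3): 4 - 2 + 1 per axis
pooled = max_pool(x)        # shape (2, 2): the max of each 2x2 block
print(feat)
print(pooled)
```

Pooling shrinks the feature map while keeping the strongest activation in each window, which is what gives CNNs a degree of translation tolerance.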


#9

#### Concepts:

1. Learning Rate
2. Dropout
3. Covariate Shift vs. Internal Covariate Shift
4. Batch Normalization
5. Convolution Kernel
6. Pooling
7. Covariate Shift
8. Cost Function
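Batch Normalization (concept 4) has a compact forward pass: normalize each feature to zero mean and unit variance over the mini-batch, then rescale. A minimal NumPy sketch, with the learnable scale/shift (gamma, beta) fixed at 1 and 0 and ignoring the running statistics used at inference time:

```python
import numpy as np

def batch_norm(x, gamma=1.0, beta=0.0, eps=1e-5):
    """Simplified batch-norm forward pass for x of shape (batch, features)."""
    mean = x.mean(axis=0)              # per-feature mean over the batch
    var = x.var(axis=0)                # per-feature variance over the batch
    x_hat = (x - mean) / np.sqrt(var + eps)
    return gamma * x_hat + beta        # learnable scale and shift

rng = np.random.default_rng(0)
x = rng.normal(loc=5.0, scale=3.0, size=(64, 10))  # shifted, scaled inputs
y = batch_norm(x)
print(y.mean(axis=0))  # close to 0 for every feature
print(y.std(axis=0))   # close to 1 for every feature
```

This per-batch re-centering is exactly what the "Internal Covariate Shift" discussion (concept 3) is about: each layer sees inputs with a stable distribution regardless of how earlier layers drift during training.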


#10

### Network architectures:

1. Inception V3
2. Inception V2
3. AlexNet
4. GoogLeNet
5. The VGG family
6. Caffe neural network link: http://blog.csdn.net/quincuntial/article/details/72832136
7. ResNet
8. DenseNet http://blog.csdn.net/xbinworld/article/details/45619685
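One piece of arithmetic that comes up constantly when reading any of these architectures is the conv-layer output size: out = floor((in - kernel + 2*padding) / stride) + 1. A quick sanity-check helper, using AlexNet's well-known first conv layer and a VGG-style 3x3 layer as examples:

```python
def conv_out_size(in_size, kernel, stride=1, padding=0):
    """Spatial output size of a convolution layer along one axis."""
    return (in_size - kernel + 2 * padding) // stride + 1

# AlexNet conv1: 227x227 input, 11x11 kernel, stride 4, no padding -> 55x55
print(conv_out_size(227, 11, stride=4))

# VGG-style conv: 3x3 kernel, stride 1, padding 1 preserves the input size
print(conv_out_size(224, 3, stride=1, padding=1))
```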


#11

### Concepts:

1. Activation Function
2. Mini-batch SGD / SGD / GD
3. Momentum
4. AdaGrad
5. Adam
6. Sigmoid / ReLU / LReLU / PReLU / RReLU / tanh
7. Softmax + Cross Entropy
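A minimal sketch of concepts 2 and 3: plain SGD versus SGD with momentum, minimizing the toy objective f(w) = w^2 (gradient 2w). The learning rate, momentum coefficient, and step count below are arbitrary illustration values; the point is only that momentum accumulates a velocity term from past gradients.

```python
def grad(w):
    return 2.0 * w  # gradient of f(w) = w**2

def sgd(w, lr=0.1, steps=100):
    """Plain gradient descent: step directly down the current gradient."""
    for _ in range(steps):
        w = w - lr * grad(w)
    return w

def sgd_momentum(w, lr=0.1, mu=0.9, steps=100):
    """Momentum: velocity accumulates an exponential average of gradients."""
    v = 0.0
    for _ in range(steps):
        v = mu * v - lr * grad(w)
        w = w + v
    return w

print(sgd(5.0))           # both runs drive w toward the minimum at 0
print(sgd_momentum(5.0))
```

On curved, poorly-conditioned surfaces momentum damps the zig-zagging of plain SGD; on this 1-D bowl it mainly illustrates the update rule.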


#12

### Concepts

1. What is the Dirichlet distribution? And what is a Dirichlet process?

2. The Beta distribution?

3. How to introduce Gaussian Processes in an accessible way?

4. How to explain hidden Markov models with simple, easy-to-understand examples?

5. Estimation theory. Link:

6. When should SVM be used versus logistic regression?
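The Beta and Dirichlet questions above are easy to poke at numerically: NumPy can sample both, and every Dirichlet draw is a point on the probability simplex (non-negative entries summing to 1). The parameter choices and seed below are arbitrary; this is a sampling sketch, not an explanation of the Dirichlet process itself.

```python
import numpy as np

rng = np.random.default_rng(42)

# Beta(a, b) lives on [0, 1] with mean a / (a + b); here 2 / 7
beta_samples = rng.beta(2.0, 5.0, size=100_000)
print(beta_samples.mean())  # close to 2/7 ~ 0.2857

# Dirichlet(alpha) generalizes Beta to K dimensions:
# each draw is a probability vector
dir_sample = rng.dirichlet([1.0, 2.0, 3.0])
print(dir_sample, dir_sample.sum())  # entries >= 0, sum exactly 1
```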


#13

A roundup of excellent word2vec blog posts:

1. Word2Vec Tutorial - The Skip-Gram Model (Part 1): http://mccormickml.com/2016/04/19/word2vec-tutorial-the-skip-gram-model/

2. Word2Vec Tutorial Part 2 - Negative Sampling: http://mccormickml.com/2017/01/11/word2vec-tutorial-part-2-negative-sampling/

3. word2vec Tutorial: https://rare-technologies.com/word2vec-tutorial/
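The core data-preparation idea in the skip-gram tutorials linked above is simple: for each center word, emit (center, context) training pairs within a fixed window. A pure-Python sketch (the window size and whitespace tokenization are placeholder choices, not what the tutorials prescribe):

```python
def skipgram_pairs(tokens, window=2):
    """Generate (center, context) training pairs for skip-gram."""
    pairs = []
    for i, center in enumerate(tokens):
        lo = max(0, i - window)
        hi = min(len(tokens), i + window + 1)
        for j in range(lo, hi):
            if j != i:  # every neighbor except the center word itself
                pairs.append((center, tokens[j]))
    return pairs

sentence = "the quick brown fox jumps".split()
for center, context in skipgram_pairs(sentence, window=1):
    print(center, "->", context)
```

The negative-sampling post (link 2) is then about how the model is trained on these pairs: instead of a full softmax over the vocabulary, each positive pair is contrasted against a few randomly drawn "negative" words.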


#14

# LDA2vec: Word Embeddings in Topic Models https://towardsdatascience.com/lda2vec-word-embeddings-in-topic-models-4ee3fc4b2843


#15

#### Predicting Taxi Demand at Airports in NYC

#### Multivariate Time Series Forecasting with LSTMs in Keras https://machinelearningmastery.com/multivariate-time-series-forecasting-lstms-keras/


#16

## Gensim


#17

#18

#19

http://www0.cs.ucl.ac.uk/staff/d.silver/web/Resources.html


#20