Chris Albon

I'm Chris. I am the Director of Machine Learning at the Wikimedia Foundation (https://wikimediafoundation.org). Here are my 57 notes on applied artificial intelligence, based on ~34 sources.

Machine Learning
- Training, Test, and Validation Sets
- Reinforcement Learning
- Four Types of AI
- Dimensionality Reduction
- Feature Scaling
- Clustering
- Regression
- Classification
- Overfitting and Underfitting
- Capacity

Pre-Processing Data
- Mutual Information
- One-Hot Encoding
- Target Encoding

Neural Networks
- Stochastic Gradient Descent
- Deep Double Descent
- Internal Covariate Shift
- Learning Rate
- FLOPs
- Vanishing Gradient
- Weight Decay
- Regularization
- Gradient Descent
- Positional Encoding
- Bias
- Motivation
- Autoencoders
- How Deep Neural Networks Learn
- Batching

Layers
- Skip Connections
- Batch Normalization Layer
- Convolutional Layers
- Embedding Layers
- Max Pooling Layer
- Fully Connected Layers
- Dropout Layer

Activation Functions
- Gaussian Error Linear Unit (GELU)
- Hyperbolic Tangent (tanh)
- Rectified Linear Unit (ReLU)
- Leaky ReLU
- What Is an Activation?
- Motivation

Large Language Models
- Fine-Tuning vs. Training
- History
- Hallucinations

Python
- Type Hinting
- Object-Oriented Programming
  - Example of Property
  - Abstract Base Classes and Methods
  - Difference Between Class and Instance Variables
  - Three Tenets of OOP
  - Polymorphism
  - Inheritance
  - Encapsulation
  - Abstraction

MLOps
- Data Warehouse vs. Data Lake
- Declarative ML Systems
- Data Model

Mathematics
- Linear Algebra
  - Broadcasting
  - Vector Operations
  - Tensors
- Probability
  - Sample Space
  - Central Limit Theorem
  - Random Variable
  - Law of Large Numbers

Computer Science
- Handling Numbers
- Data
  - Extracting Training Data from Models
  - MNIST Dataset

Software Engineering
- Clean Code
- Concurrency