main
October 4th, 2025    

CISC 7412X

Notes
0001

PDFs, etc.
AI Intro
Data
Math
Differentiation
Backpropagation
backprop.pl

Oracle Primer
MySQL Primer
PostgreSQL Primer
Hadoop/Hive
HBase Primer
Spark Primer
Trino Primer


Feynman's Tips

Homeworks

HW1: Download some big collection of text. I highly recommend Project Gutenberg (google for it; you can download the entire DVD). Part 1: Write a parser/reader that collects the probability of every word following any other word. Convert everything to lowercase. Punctuation breaks the sequence (e.g., end of sentence). Part 2: Using the probabilities from Part 1, generate text... start with a random word, then randomly (weighted by the probabilities) pick the following word, and continue. Email your code and a sample of the generated output. Other things to experiment with: gather probabilities of a word following a pair or triplet of words, etc. (originally assigned 20140910)
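
Here is a minimal sketch of one way to do both parts in Python. The corpus filename (corpus.txt), the tokenizing regex, and the 200-word output length are illustrative choices, not part of the assignment. Raw successor counts are used directly as sampling weights, which is equivalent to normalizing them into probabilities first.

    # Sketch for HW1 (illustrative: filename, tokenizer, and output length are my choices).
    import random
    import re
    from collections import defaultdict

    def build_counts(path):
        """Count how often each word follows each other word.

        Everything is lowercased; any punctuation token breaks the sequence,
        so the last word of a sentence is not counted as preceding the first
        word of the next sentence.
        """
        counts = defaultdict(lambda: defaultdict(int))
        prev = None
        with open(path, encoding="utf-8", errors="ignore") as f:
            for line in f:
                # Tokens are runs of letters/apostrophes or single punctuation marks.
                for tok in re.findall(r"[a-z']+|[^\sa-z']", line.lower()):
                    if tok[0].isalpha() or tok[0] == "'":
                        if prev is not None:
                            counts[prev][tok] += 1
                        prev = tok
                    else:
                        prev = None  # punctuation breaks the chain
        return counts

    def generate(counts, n_words=200):
        """Start from a random word, then repeatedly pick a follower weighted by its count."""
        word = random.choice(list(counts))
        out = [word]
        for _ in range(n_words - 1):
            followers = counts.get(word)
            if not followers:
                word = random.choice(list(counts))   # dead end: restart at a random word
            else:
                word = random.choices(list(followers), weights=list(followers.values()))[0]
            out.append(word)
        return " ".join(out)

    if __name__ == "__main__":
        counts = build_counts("corpus.txt")   # e.g. a Project Gutenberg text
        print(generate(counts))

The same structure extends to the pair/triplet experiment: key the counts on a tuple of the previous two or three words instead of a single word.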


HW2: Create a neural network with 8 binary inputs, 3 hidden units, and 8 outputs. Your inputs are the 8-bit patterns 00000001, 00000010, 00000100, 00001000, 00010000, 00100000, 01000000, 10000000. Your expected outputs match the inputs. In other words, you give the network 00000010 and expect it to produce 00000010, etc. You should use backpropagation for training, but feel free to use whatever other method works. After successful training, your network should NOT make any mistakes (you give it 00000010 and it always outputs 00000010, etc.). Now you can cut away the output layer, and you end up with a network that turns your input string into a 3-bit binary representation. Note that 00000010 will not necessarily correspond to 010 ('2' in binary). Submit your code along with a log of output...
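
Here is a minimal sketch of the 8-3-8 network in Python with NumPy, assuming sigmoid units, squared error, and plain batch gradient descent; the learning rate, epoch count, and random seed are illustrative choices, and (per the assignment) other training methods that work are fine too.

    # Sketch for HW2 (illustrative: sigmoid units, squared error, batch gradient
    # descent, and the specific learning rate / epoch count are my choices).
    import numpy as np

    rng = np.random.default_rng(0)

    X = np.eye(8)        # the eight inputs 00000001 ... 10000000, one per row
    Y = X.copy()         # expected outputs are identical to the inputs

    # Weights and biases: input->hidden (8x3) and hidden->output (3x8).
    W1 = rng.normal(0.0, 0.5, (8, 3)); b1 = np.zeros(3)
    W2 = rng.normal(0.0, 0.5, (3, 8)); b2 = np.zeros(8)

    def sigmoid(z):
        return 1.0 / (1.0 + np.exp(-z))

    lr = 1.0
    for epoch in range(30000):
        # Forward pass over all eight patterns at once.
        h = sigmoid(X @ W1 + b1)          # hidden activations, shape (8, 3)
        o = sigmoid(h @ W2 + b2)          # outputs, shape (8, 8)

        # Backpropagation for squared error; sigmoid derivative is s * (1 - s).
        d_o = (o - Y) * o * (1.0 - o)     # error signal at the output layer
        d_h = (d_o @ W2.T) * h * (1.0 - h)

        W2 -= lr * (h.T @ d_o); b2 -= lr * d_o.sum(axis=0)
        W1 -= lr * (X.T @ d_h); b1 -= lr * d_h.sum(axis=0)

    # After enough training, the thresholded outputs should typically reproduce the
    # inputs, and the thresholded hidden units give each input its learned 3-bit
    # code (not necessarily the standard binary encoding of the input's position).
    h = sigmoid(X @ W1 + b1)
    o = sigmoid(h @ W2 + b2)
    print("reconstructions:");    print((o > 0.5).astype(int))
    print("3-bit hidden codes:"); print((h > 0.5).astype(int))

Thresholding the hidden activations at 0.5 after training is one way to read off the 3-bit code each input maps to; as the assignment notes, that code generally will not be the standard binary value of the input's position.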

© 2006, Particle