This course covers how you can use NLP to solve practical problems.
There are four videos:
- Overview and Converting Text to Vectors
  - For finding similar documents (a minimal code sketch follows this list)
  - "I have this document or text; which others talk about the same stuff?"
  - Video
- Learning with Vectors and Classification
  - For classifying documents
  - "I need to put these documents into buckets."
  - Video
- Visualizing
  - For seeing what document vectors look like in 3D space
  - "I need to quickly see what looks similar to what."
  - Video
- Sequence Generation and Extracting Pieces of Information from Text
  - For translation and document summarization, and for pulling out sentences and documents that talk about specific things
  - "I need every mention of a street address or business in Garland, Texas; and I need each document translated to Urdu."
  - Video
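To give a concrete taste of the first video's topic (converting text to vectors and then finding similar documents), here is a minimal sketch. It is an illustration only, not the course code; it assumes scikit-learn's TF-IDF vectors and cosine similarity as one common way to do it, and the example documents are made up.

```python
# Minimal sketch: turn documents into vectors, then rank them by similarity.
# Assumes scikit-learn is installed; illustrative only, not the course code.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

documents = [
    "The wagon carried hay to the barn.",
    "A cart hauled hay across the farm.",
    "The stock market fell sharply today.",
]

# Each document becomes a sparse vector of TF-IDF term weights.
vectorizer = TfidfVectorizer(stop_words="english")
doc_vectors = vectorizer.fit_transform(documents)

# Compare the first document against all of them; higher score = more similar.
scores = cosine_similarity(doc_vectors[0], doc_vectors)[0]
for doc, score in zip(documents, scores):
    print(f"{score:.2f}  {doc}")
```

The two farm sentences score high against each other because they share vocabulary, while the stock-market sentence scores near zero. TF-IDF is just one simple way to get document vectors; the GitHub repo linked below will have the code actually used in the videos.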
The idea is to make short videos that focus on the aspects of NLP that currently work well and are useful.
Speech-to-text now works pretty well, so these methods will also be useful for the audio portions of videos.
All code will be available on GitHub: https://github.com/jmugan/modern_practical_nlp
- PhD in Computer Science in 2010 from UT Austin
- Thesis work was about how a robot could wake up in the world and figure out what is going on
- Work at DeUmbra, where we build AI for the DoD
- We also work in healthcare, which I can talk about: a future video (not in this series) will cover how we use graph neural networks to identify who is at risk for opioid overdose
- Hands-on technical advisor at our sister company Pulselight
- Wrote The Curiosity Cycle: Preparing Your Child for the Ongoing Technological Explosion
- Advisor at KUNGFU.AI
- Also do independent consulting work
- You can find me at jonathanwilliammugan@gmail.com or on Twitter at @jmugan
- Reading requires mapping language to internal concepts grounded in behaving in the same general environment as the writer.
- Computers don’t have those concepts.
- Example: “I pulled the wagon.” Computers don’t know that wagons can carry things or that pulling exerts a gentle tension on the arm and leg muscles as one walks.
- Writing requires mapping internal concepts to language, where those concepts are grounded in behaving in the same general environment as the expected reader.
- Computers don’t have those concepts, either.
- NLP is about how to make natural language amenable to computation even though computers can’t read or write.
- Representing text as vectors has transformed NLP in the last 10 years (a short sketch of the idea follows this list).
- There are also symbolic methods that are practically useful; we will cover those too.
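To make the vector idea concrete, here is another small, hedged sketch, this time with dense (learned) vectors instead of TF-IDF counts. It assumes spaCy and its en_core_web_md model (downloaded separately with `python -m spacy download en_core_web_md`); the sentences are placeholders, and the videos may well use different tools.

```python
# Minimal sketch: dense document vectors with spaCy (illustrative only).
import spacy

# en_core_web_md ships 300-dimensional word vectors.
nlp = spacy.load("en_core_web_md")

doc1 = nlp("I pulled the wagon.")
doc2 = nlp("She dragged the cart.")
doc3 = nlp("The quarterly earnings report was released today.")

# Doc.vector averages the token vectors, so two sentences can be close
# in vector space even when they share almost no words.
print(doc1.vector.shape)        # (300,)
print(doc1.similarity(doc2))    # should be noticeably higher ...
print(doc1.similarity(doc3))    # ... than this one
```

The wagon and cart sentences end up near each other even though they share almost no words; that jump from matching surface words to comparing learned vectors is the shift referred to above.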
- Microsoft Flight Simulator 2020 is an inflection point for virtual worlds and our own
- Generating Natural-Language Text with Neural Networks
- Why Is There Life? and What Does It Have to Do with AI?
- Chatbots: Theory and Practice
- You and Your Bot: A New Kind of Lifelong Relationship
- Computers Could Understand Natural Language Using Simulated Physics
- The Two Paths from Natural Language Processing to Artificial Intelligence
- DeepGrammar: Grammar Checking Using Deep Learning
- Deep Learning for Natural Language Processing
- What Deep Learning Really Means
- My O'Reilly course on NLP: Natural Language Text Processing with Python