Computational Cognitive Science


chengjun wang

Mar 8, 2013, 8:12:08 AM
to computational...@googlegroups.com

Computational Cognitive Science


NOTE: This page is under revision. In particular, we are in the process of checking to see if any of the content posted on this page relies on copyrighted materials belonging to someone else. If you believe that any specific content does violate copyright, please email us and we will check it immediately.

This course (COMPSCI 3016 and 7027) provides an introduction to computational theories of human cognition. We use formal models from artificial intelligence and mathematical psychology to consider fundamental issues in human knowledge representation, inductive reasoning, learning, decision-making and language acquisition. What kind of informational structures describe the organisation of human knowledge, and what kinds of inferences do they license? How do humans make choices given time constraints, computational limitations, and external costs imposed by the world? What kinds of innate knowledge (if any) must people have? And how can formal models of human cognition inform our understanding of the design of intelligent machines?

Introductory Lectures

  • Lecture 1: Introduction to CCS. What is computational cognitive science? Why study it? [pdf, 7.4MB]
  • Lecture 2: Modeling in cognitive science. What makes a good model in cognitive science? What different kinds of modelling paradigms are there? [pdf, 4.7MB]
  • Introduction to Matlab, part 1. Basics of MATLAB: variables, operators, control flow, functions, scripts [pdf, 0.5MB]
  • Lecture 3: Bayesian inference, part 1. Introduction to probability; Bayes Rule; some examples [pdf, 4.5MB] (a small worked sketch of Bayes' rule follows this list)
  • Lecture 4: Bayesian inference, part 2. More advanced Bayesian inference: the Lotto problem [pdf, 3.8MB] ["bonus slides", pdf, 2.0MB]
  • Lecture 5: Introduction to Matlab, part 2. File I/O, graphics, statistical functions [pdf, 0.8MB][code]
  • Lecture 6: Complexity and Ockham's razor. Simplicity via the likelihood and the prior; rectangle world [pdf, 1.1MB]
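
For readers who want to try the Bayes Rule material from Lectures 3-4 in code, here is a minimal MATLAB sketch. The example (a fair coin versus a coin with P(heads) = 0.9, observed to land heads 8 times out of 10) is my own toy illustration, not taken from the lecture code.

    % Minimal Bayes-rule sketch (hypothetical coin example, not from the lecture code).
    % Two hypotheses: the coin is fair (P(heads)=0.5) or biased (P(heads)=0.9).
    % Data: a particular sequence of 8 heads and 2 tails.
    prior      = [0.5, 0.5];                       % prior over [fair, biased]
    likelihood = [0.5^8 * 0.5^2, 0.9^8 * 0.1^2];   % P(data | hypothesis) for each hypothesis
    posterior  = prior .* likelihood;              % Bayes' rule: P(h|d) is proportional to P(d|h) P(h)
    posterior  = posterior / sum(posterior);       % normalise by P(d)
    disp(posterior)                                % roughly [0.19 0.81]: the biased coin is favoured

The same pattern (prior times likelihood, then normalise) scales up to the richer examples treated in the lectures.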


Language

  • Lecture 7: Introduction to language. Overview of language; phonetic category learning; k-means clustering [pdf, 1.8MB] (a toy k-means sketch follows this list)
  • Lecture 8: Phoneme learning. Phonetic category learning continued; mixture of Gaussians; Feldman & Griffiths (2009) [pdf, 1.5MB] [code, 33KB]
  • Lecture 9: Word segmentation, part 1. The problem; use of transition probabilities; n-gram models [pdf, 1.1MB] [code, 20KB]
  • Lecture 10: Word segmentation, part 2. Smoothing in n-gram models; application to human word segmentation [pdf, 0.7MB] [code, 8KB]
  • Lecture 11: Hidden Markov models, part 1. Syntax learning; parts of speech; intro to HMMs; generating from HMMs [pdf, 1.8MB]
  • Lecture 12: Hidden Markov models, part 2. Recap; Viterbi algorithm; Baum-Welch algorithm [pdf, 0.9MB] [zip, 4KB]
  • Lecture 13: Context-free grammars. The Chomsky hierarchy; intro to CFGs and PCFGs [pdf, 0.9MB]
  • Lecture 15: Language recap. Overview; various other ideas in language [pdf, 2.6MB]
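
As a companion to the phonetic category learning material in Lectures 7-8, here is a minimal k-means sketch in MATLAB. The data are hypothetical (two well-separated 2-D Gaussian clusters), not the acoustic data used in the lectures.

    % Minimal k-means sketch on hypothetical 2-D data (not the lecture dataset).
    rng(1);                                  % reproducible toy data
    X = [randn(50,2); randn(50,2) + 4];      % two clusters, near (0,0) and (4,4)
    k = 2;
    idx = randperm(size(X,1));
    mu  = X(idx(1:k), :);                    % initialise centroids at random data points
    for iter = 1:20
        % assignment step: each point goes to its nearest centroid
        D = zeros(size(X,1), k);
        for j = 1:k
            D(:,j) = sum(bsxfun(@minus, X, mu(j,:)).^2, 2);
        end
        [~, z] = min(D, [], 2);
        % update step: each centroid becomes the mean of the points assigned to it
        for j = 1:k
            mu(j,:) = mean(X(z == j, :), 1);
        end
    end
    disp(mu)                                 % centroid estimates, roughly (0,0) and (4,4)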

Computational Statistics

  • Lectures 16-21: Computational statistics. Conjugacy; importance sampling; likelihood sampling; Metropolis-Hastings; particle filtering [pdf, 8.1MB] [code, 8KB] (a toy Metropolis-Hastings sketch follows below)
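
To give a feel for the sampling material in Lectures 16-21, here is a minimal Metropolis-Hastings sketch in MATLAB. The target (a standard normal known only up to a constant) and the proposal width are my own toy choices, not the examples from the lecture code.

    % Minimal Metropolis-Hastings sketch with a toy target (standard normal).
    rng(2);
    logp = @(x) -0.5 * x.^2;        % log target density, up to an additive constant
    n    = 5000;                    % chain length
    x    = zeros(n, 1);             % chain, started at 0
    step = 1.0;                     % standard deviation of the random-walk proposal
    for t = 2:n
        prop = x(t-1) + step * randn;               % symmetric Gaussian proposal
        if log(rand) < logp(prop) - logp(x(t-1))    % accept with prob min(1, p(prop)/p(current))
            x(t) = prop;
        else
            x(t) = x(t-1);                          % reject: repeat the current value
        end
    end
    fprintf('mean %.2f, sd %.2f\n', mean(x(1001:end)), std(x(1001:end)));   % near 0 and 1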

Concepts and Categories

  • Lecture 22: Introduction to concepts. Overview; classical concepts; prototypes & exemplars; richer conceptual structure [pdf, 13.0MB]
  • Lectures 23-25: Classification. Linear & quadratic classifiers; k-nearest neighbours; kernel methods; mixture models; Dirichlet process mixture models [pdf, 11.5MB] [code, 12KB] [more code, 8KB] (a toy k-nearest-neighbour sketch follows this list)
  • Lecture 26: Relational concepts. What are relational concepts?; the infinite relational model [pdf, 6.6MB]
  • Lecture 27: Learning overhypotheses. Learning to learn; kinds of categories; the hierarchical Bayes approach [pdf, 4.5MB]
  • Lecture 28: Learning conceptual structure. Organising principles for knowledge; hierarchical Bayes framework using graph grammars [pdf, 2.4MB]
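
As a small taste of the classification material in Lectures 23-25, here is a minimal k-nearest-neighbour sketch in MATLAB, using hypothetical two-class Gaussian data rather than the category-learning datasets from the course.

    % Minimal k-nearest-neighbour classifier on hypothetical two-class data.
    rng(3);
    Xtrain = [randn(30,2); randn(30,2) + 3];   % class 1 near (0,0), class 2 near (3,3)
    ytrain = [ones(30,1); 2*ones(30,1)];
    xnew   = [2.5, 2.5];                       % query point to classify
    k      = 5;
    d = sum(bsxfun(@minus, Xtrain, xnew).^2, 2);   % squared distance to every training point
    [~, order] = sort(d);                          % nearest first
    label = mode(ytrain(order(1:k)));              % majority vote among the k nearest
    fprintf('predicted class: %d\n', label);       % class 2, given this toy data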

Decision Making

  • Lecture 29: Introduction to decision making and expected utility. What is decision-making? The classical EU approach; St Petersburg paradox [pdf, 5.5MB] [code, 8KB] (a toy St Petersburg simulation follows this list)
  • Lecture 30: Prospect theory, heuristics and biases. Framing & non-invariant utilities; prospect functions; overview of different views [pdf, 3.7MB]
  • Lecture 31: Introduction to psychophysics. Mapping objective to subjective quantities; Fechnerian psychophysics [pdf, 4.8MB]
  • Lecture 32: Sequential sampling models. The role of time; sequential probability ratio test; overview of SSMs; applications [pdf, 5.4MB]
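
To make the St Petersburg paradox from Lecture 29 concrete, here is a minimal simulation in MATLAB. The payoff scheme (pot doubles on every tail, paid out at the first head) and the log-utility comparison are standard textbook choices, not necessarily the exact setup used in the lecture.

    % Minimal St Petersburg simulation: huge expected value, modest log utility.
    rng(4);
    ngames = 1e5;
    payoff = zeros(ngames, 1);
    for g = 1:ngames
        tosses = 1;
        while rand < 0.5           % tails: keep tossing
            tosses = tosses + 1;
        end
        payoff(g) = 2^tosses;      % the game pays 2^k if the first head is on toss k
    end
    fprintf('mean payoff %.1f, mean log-utility %.2f\n', mean(payoff), mean(log(payoff)));
    % The sample mean keeps growing as ngames increases (the true expectation is infinite),
    % while the mean log utility stays near 2*log(2), in line with Bernoulli's resolution.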


Auxiliary Material



Best regards.                         
                                       
Chengjun Wang

Web Mining Lab
Department of Media and Communication
City University of Hong Kong.
Room 5008, Run Run Shaw Creative Media Centre, 18 Tat Hong Avenue
Kowloon, Hong Kong


wang pianpian

Mar 8, 2013, 7:27:53 PM
to computational...@googlegroups.com
It's good! 
Thanks!


2013/3/8 chengjun wang <wang...@gmail.com>




--
Wang Pianpian
PhD Candidate 
City University of Hong Kong
Media and Communication Department