>> PLM_demo
Hello!
Hebbian Perceptual Learning Model, Release 1.1.1, 2008-02-22.
Copyright (c) Alexander Petrov 1999-2008, http://alexpetrov.com

This script is intended to get you started with the Hebbian perceptual
learning model. It assumes you have already run PercLearn_install, which
has updated the MATLAB path. The model and experiments are described in
the following articles:

  Petrov, A., Dosher, B., & Lu, Z.-L. (2005). The dynamics of perceptual
    learning: An incremental reweighting model. Psychological Review,
    112(4), 715-743.
  Petrov, A., Dosher, B., & Lu, Z.-L. (2006). Perceptual learning without
    feedback in non-stationary contexts: Data and model. Vision Research,
    46, 3177-3197.

Abstracts and reprints available on-line:
  http://alexpetrov.com/pub/perclearn/
New information may be available on the PLM project web site:
  http://alexpetrov.com/proj/plearn/

Changing current dir to /Users/apetrov/temp/PercLearn/PLModel1

It is recommended that you open the file PLM_demo.m in the MATLAB Editor
and inspect the code as you follow along. The file should now be open...

Type 'return' and press Enter to continue.
K>> return

******* Let us make two sample stimuli first.
The stimuli are created by the function MAKE_PLE_STIM, which is part of
the software that administered the experiment dubbed PLExp1 and reported
in the Psych Review article cited above.

As a general rule, the parameters needed to do something are bundled in
a parameter structure, which is then passed to the functions that do the
real job. There are functions with names ending in '_params' that
generate default params from scratch. It is conventional to store the
PLExp1 parameters in the variable P:

P =
            size: [64 64]
        duration: 75
    Gabor_orient: [-0.1745 0.1745]
    noise_orient: [-0.2618 0.2618]
     noise_Cpeak: 0.6667
     Gabor_Cpeak: [0.2300 0.1500 0.1000 0.5000]
    Gabor_params: [1x1 struct]
    noise_params: [1x1 struct]
      LUT_params: [1x1 struct]
        position: {[0 -111]  [0 111]}
           times: [1x1 struct]
        mask_idx: [889x1 double]
           Gabor: [4-D double]
          Filter: [64x64x2 double]
          colors: [1x1 struct]
    viewing_dist: 720
          mm_pix: 0.5655
         deg_pix: 0.0450
      msec_frame: 8.3333
        N_frames: 9
               x: [1x64 double]

Let us make a left Gabor target in Context L, easiest difficulty level
(target contrast):

  Name          Size        Bytes  Class     Attributes

  P             1x1        340976  struct
  PLM_path      1x38           76  char
  ans           1x46           92  char
  context       1x1             8  double
  cstr          1x44           88  char
  diff          1x1             8  double
  err           1x1          1276  struct
  orient        1x1             8  double
  stimL         64x64       32768  double
  target        1x1             8  double

Type 'return' and press Enter to continue.
K>> return

Contrast that with a right target in the same Context L:

Type 'return' and press Enter to continue.
K>> return

******* Representational subsystem...
The parameters of the representational subsystem of the model are:

Idef =
     img_size: [64 64]
     FFT_size: [128 128]
     LUT_gray: 128
     LUT_span: 127
      deg_pix: 0.0450
       flipud: 1
     bkgr_idx: 1
     D_orient: [-45 -30 -15 0 15 30 45]
     N_orient: 7
     D_spfreq: [1 1.4142 2 2.8284 4]
     N_spfreq: 5
    HW_orient: 15
      octaves: 1
      W2sigma: 0.4247
    CGC_const: 0
    CGC_Fpool: [5x5 double]
     CGC_mode: 'E1/N1'
    rep_scale: 1
    FW_integr: 2
      centers: [0 0]
     rep_size: [7 5 1]

Before we can run PLFRONTEND, we must call MAKE_PLFRONTEND on the Idef
structure.

Front end: 7 orientations by 5 spatial frequencies.
Orientation HWHH: 15 deg. Sp.freq. FWHH: 1 octaves.
Contrast gain control via divisive normalization: E1/N1.
Radial Gaussian integration weights: FW=2.0 deg.vis.ang.
Integration centers: (0.0,0.0) deg.vis.ang.
Global variable FRONTEND_MEMOIZATION updated.
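Note where the number 35 below comes from: the front end tiles 7
orientation channels by 5 spatial frequencies at 1 integration center,
so each stimulus is summarized by

    prod(Idef.rep_size)     % 7 * 5 * 1 = 35 representation units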
PLFRONTEND can now be used to transform any 64x64 image into a 35-element
representation:

Type 'return' and press Enter to continue.
K>> return

******* PLM_CACHE
The perceptual subsystem involves computer-intensive convolutions. It is
therefore convenient to generate a large cache of representations, store
them in the global variable PLM_CACHE, and reuse them over and over again.
The function CACHE_INPUTS is the tool to do that. It would take several
hours to generate a sufficient number of samples, however, so let us just
load it from the file PLM_CACHE.mat instead:

PLM_CACHE is a structure array with 12 elements, one for each stimulus
type. There are 5000 samples of each kind. For example:

ans =
    arg: [1 1 1]
      N: 1000
    rep: [1000x35 double]

Note that these 'representations' do not yet contain internal noise and
have not been passed through the saturating nonlinearity of the activation
function of the representation units. The functions AGGREG_CACHE and
HEBB_STATS facilitate the statistical analysis of these caches.

Type 'return' and press Enter to continue.
K>> return

******* Non-stationary stimulus presentation sequences
Experiments PLExp1 and PLExp2 involve presentation schedules that switch
the predominant background orientation in a non-stationary manner. The
functions EXPAND_BLOCK_SPEC and BLOCK_FROM_CACHE are the tools to generate
such schedules. See the documentation section of these files for details,
particularly Example 3 in BLOCK_FROM_CACHE.M. For our present purposes,
let us just load the schedule descriptors from the file SCHED.MAT:

The variable sched1 defines Schedule 1, and the variable sched2 defines
Schedule 2, as described in Petrov, Dosher, & Lu (2005). There are 32
blocks of 300 trials each. Each block contains 50 replications of 6
stimulus kinds: 2 Gabor orientations x 3 Gabor contrasts. The context is
fixed within each block but changes in 8-block-long epochs depending on
the schedule. For example, block 1 in Schedule 1 has the following
specification. The first 3 columns correspond to the arguments of
MAKE_PLE_STIM and the .arg field in PLM_CACHE. Note that everything is in
context L=1 in this case.

ans =
     1     1     1    50
     2     1     1    50
     1     2     1    50
     2     2     1    50
     1     3     1    50
     2     3     1    50

Type 'return' and press Enter to continue.
K>> return

We are now in a position to generate some stimulus sequences (or
'descriptors'). The first ten trials in the sequence are as follows:

ans =
     2     3     1     6   663
     2     2     1     4   852
     1     3     1     5   825
     2     2     1     4   646
     1     3     1     5   609
     1     1     1     1   170
     1     3     1     5   993
     2     1     1     2   841
     1     1     1     1   281
     2     2     1     4   607

The variable DESCR has 9600 trials (rows) and 5 columns, as documented in
BLOCK_FROM_CACHE. The last two columns specify the cache index (1-12) and
the replication index (1-5000) in the global variable PLM_CACHE,
respectively. The model pulls the corresponding representation directly
from PLM_CACHE on each simulated trial, avoiding any calls to PLFRONTEND.
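In other words, the lookup implied by those last two columns amounts to
roughly the following (an illustrative sketch only; the actual lookup is
performed inside the model code, see PLM_HEBB2.M):

    global PLM_CACHE
    t    = 1;                             % any trial number within descr
    kind = descr(t,4);                    % cache index: stimulus kind, 1-12
    repl = descr(t,5);                    % replication index within that kind
    rep  = PLM_CACHE(kind).rep(repl,:);   % 1x35 noise-free representation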
Type 'return' and press Enter to continue.
K>> return

******* Running the perceptual learning model...
With all this infrastructure at hand, we're finally ready to run the
learning component of the model. The function PLM_PARAMS generates a
default parameter structure, with parameter values reported in the Vision
Research article cited above. (Use PLM_PARAMS1 for the values in the
Psych Review article.)

Mparams =
      rep_size: [7 5 1]
       max_act: 0.5000
      rep_gain: 0.8000
      out_gain: 5
     rep_noise: 0.1000
     out_noise: 0.1700
      W_minmax: [-1 1]
        W_init: 0.1700
       W0_seed: [35x1 double]
    learn_rate: 0.0016
    runav_rate: 0.0200
     criterion: 2.2000
      fdbk_wgt: 1.8000
    fdbk_fract: 1
     fdbk_mode: 1
      blk_size: 300

Type 'return' and press Enter to continue.
K>> return

PLM_HEBB2 implements the most recent version of the model (as of July
2005). It uses soft clamping to introduce feedback, and introduces it only
on error. See the Vision Research article for discussion. PLM_HEBB1
implements a slightly different version used in the Psych Review article
(Petrov, Dosher, & Lu, 2005). (PLM_HEBB1 is a special case of PLM_HEBB2
when Mparams.fdbk_wgt=Inf and .fdbk_mode=1.)

The model can be invoked in several different ways, as documented in
PLM_HEBB2.M. The simplest syntax is [o,Wh] = PLM_Hebb2(descr,Mparams):

ans =
    1.0000   -0.4301   -0.2650         0    1.0000    0.4984
    1.0000   -0.0287   -0.2444   -0.0440    1.0000    0.4999
    1.0000   -0.3712   -0.4517   -0.0871         0   -0.3712
    1.0000   -0.0549   -0.2115   -0.1294    1.0000    0.4998
    1.0000   -0.2268   -0.4565   -0.1708         0   -0.2268
    2.0000    0.3752   -0.3424   -0.2114   -1.0000   -0.4991
    1.0000   -0.0901   -0.4421   -0.1631         0   -0.0901
    2.0000    0.2865   -0.0728   -0.2039         0    0.2865
    1.0000   -0.2570   -0.3829   -0.1558         0   -0.2570
    2.0000    0.0151   -0.2744   -0.1967         0    0.0151

The leftmost column in the matrix O is the behavioral response of the
model (1="Left", 2="Right"). The other columns keep track of various
internal quantities. The printout above lists the first 10 trials of a
9600-trial run.

Type 'return' and press Enter to continue.
K>> return

The variable Wh keeps the weight history of the run. See PLM_HEBB2.M for
details.

Type 'return' and press Enter to continue.
K>> return

******* Calculate d' learning curves
The function PLM_STATS converts the long sequence of binary responses into
z-transformed probabilities of correct response for congruent and
incongruent stimuli in each block. This is analogous to the analysis of
the behavioral data (see PLEDATA_BY_SBJ and PLE2_STATS). PLM_STATS can
analyze a whole batch of runs at once, which are then averaged by
PLM_SUMMARY. In our case we have just a single run to analyze:

sumstats =
       N_runs: 1
     zP_congr: [32x3 double]
     zP_incon: [32x3 double]
       dprime: [32x3 double]
        accZP: [0.8964 0.8791 1.0258 1.3088 0.4804 -0.1764]
    mnA_congr: [32x3 double]
    mnA_incon: [32x3 double]
       Aprime: [32x3 double]
         accA: [0.1681 0.1719 0.1849 0.2430 0.0960 -0.0337]
    mnI_congr: [32x3 double]
    mnI_incon: [32x3 double]
       Iprime: [32x3 double]
         accI: [0.4683 0.4837 0.4939 -0.0507 -0.2116 -0.3370]
     contextc: [1x20 double]
     contexti: [1x20 double]
       otherc: [1x20 double]
       otheri: [1x20 double]

Type 'return' and press Enter to continue.
K>> return

******* Plot the model performance against PLExp1 data.
The file HUMAN_ZCONGR.MAT contains the group averages of the z-probability
profiles of the first perceptual learning experiment (PLExp1, with
feedback). See PLEDATA_BY_SBJ for details. PLOT_PLM_FIT can then be used
to plot the model data against the human data from PLExp1. (Use
PLOT_PLM_FIT2 with ../PLExp2/modelfits/HUMAN_ZCONGR.mat for the
no-feedback condition.)
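For a quick look at the learning curves from this single run, the d'
field of sumstats can also be plotted block by block with standard MATLAB
graphics (a minimal sketch; PLOT_PLM_FIT produces the full comparison
against the human data):

    figure
    plot(sumstats.dprime, '.-')     % 32 blocks x 3 Gabor contrasts
    xlabel('Block number')
    ylabel('d''')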
Type 'return' and press Enter to continue.
K>> return

******* Where to go from here.
This illustrates the main functions involved in running the multichannel
perceptual learning model with default parameters. There are specialized
utilities for parameter search -- see in particular the functions
PLM_ZCONGR2, MAKE_CHEV, PLM_SEARCH_PARAMS, and the Paramsearch Toolbox.

The README files and the numerous .txt notes in various directories under
PLModel1 contain transcripts of most simulations and data fits reported in
the two journal articles listed at the beginning of this tutorial. The
notes in PLModel1/newsimul/*.txt are the most recent ones. There is also
quite a bit of documentation in the preambles of the .m functions that
implement the model.

Thank you for your interest in this work. If you have any questions,
please contact Alex Petrov, http://alexpetrov.com
Check the PLM project web site for recent updates:
  http://alexpetrov.com/proj/plearn/

End of PLM_DEMO. Last updated 2008-02-22.

>>
>>
>> whos
  Name              Size            Bytes  Class     Attributes

  HUMAN_ZCONGR      6x32             1536  double    global
  Hparams           1x1             11268  struct
  Idef              1x1              3384  struct
  Mparams           1x1              2408  struct
  P                 1x1            340976  struct
  PLM_CACHE         12x1           3362736  struct    global
  PLM_path          1x38               76  char
  Wh                37x9601        2841896  double
  ans               10x6               480  double
  bnd               1x6                 48  double
  context           1x1                  8  double
  cstr              1x44                88  char
  descr             9600x5          384000  double
  diff              1x1                  8  double
  err               1x1               1276  struct
  f                 1x5                 40  double
  foo               37x961          284456  double
  k                 1x1                  8  double
  l                 1x5                380  cell
  o                 9600x6          460800  double
  or                1x7                 56  double
  orient            1x1                  8  double
  repL              1x35               280  double
  repR              1x35               280  double
  sched1            32x1               8064  cell
  sched2            32x1               8064  cell
  stats             1x1                9812  struct
  stimL             64x64             32768  double
  stimR             64x64             32768  double
  sumstats          1x1                9812  struct
  target            1x1                   8  double

>>