
Learning theory and algorithms (MLR-LEAR)

Serious Games and ML for Detecting MCI


Our work focuses on detecting Mild Cognitive Impairment (MCI) by developing Serious Games (SG) for mobile devices, distinct from games marketed as 'brain training', which claim to maintain mental acuity. One such game, WarCAT, captures players' moves during play in order to infer processes of strategy recognition, learning, and memory. The purpose of the game is to combine the generated game-play data with machine learning (ML) to help detect MCI, which is difficult to detect for several reasons.
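
The abstract does not spell out the learning pipeline, but the basic idea of turning per-player game-play logs into features and classifying them with an off-the-shelf model can be sketched as follows. Everything below (the feature names, the synthetic data, and the random-forest classifier) is an illustrative assumption, not the authors' method.

import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

# Hypothetical per-player features derived from WarCAT game-play logs,
# e.g. mean move latency, error rate, learning slope across rounds.
# The data and labels here are synthetic placeholders.
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 3))        # 200 players, 3 game-play features
y = rng.integers(0, 2, size=200)     # 1 = MCI, 0 = control (synthetic labels)

clf = RandomForestClassifier(n_estimators=200, random_state=0)
scores = cross_val_score(clf, X, y, cv=5, scoring="roc_auc")
print("cross-validated AUC:", scores.mean())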

Paper Details

Authors:
Mahmood Aljumaili, Robert D McLeod, Marcia Friesen
Submitted On:
12 November 2019 - 12:45am

Document Files

GlobalSIP 2019 - Serious Games and ML for Detecting MCI.pdf

[1] Mahmood Aljumaili, Robert D McLeod, Marcia Friesen, "Serious Games and ML for Detecting MCI", IEEE SigPort, 2019. [Online]. Available: http://sigport.org/4949. Accessed: Nov. 13, 2019.

Wave Physics Informed Dictionary Learning in One Dimension

Paper Details

Authors:
Harsha Vardhan Tetali, K. Supreet Alguri, Joel B. Harley
Submitted On:
25 October 2019 - 1:50pm

Document Files

HarshaMLSP.pdf

[1] Harsha Vardhan Tetali, K. Supreet Alguri, Joel B. Harley, "Wave Physics Informed Dictionary Learning in One Dimension", IEEE SigPort, 2019. [Online]. Available: http://sigport.org/4895. Accessed: Nov. 13, 2019.

Generic Bounds on the Maximum Deviations in Sequential/Sequence Prediction (and the Implications in Recursive Algorithms and Learning/Generalization)


In this paper, we derive generic bounds on the maximum deviations in prediction errors for sequential prediction via an information-theoretic approach. The fundamental bounds are shown to depend only on the conditional entropy of the data point to be predicted given the previous data points. In the asymptotic case, the bounds are achieved if and only if the prediction error is white and uniformly distributed.
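
To make the extremal role of the uniform distribution concrete, here is a short consistency check in our own notation; it is an illustration, not the paper's theorem. If the prediction error $e_k = x_k - \hat{x}_k$ is uniformly distributed on $[-a, a]$, then its differential entropy (in bits) is
\[
h(e_k) = \log_2 (2a), \qquad \text{so} \qquad \operatorname{ess\,sup} |e_k| = a = 2^{\,h(e_k) - 1}.
\]
Conversely, among all error distributions supported on $[-a, a]$, the uniform one maximizes $h(e_k)$, which is consistent with the abstract's statement that entropy-based bounds on the maximum deviation are attained exactly when the prediction error is white and uniformly distributed.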

Paper Details

Authors:
Song Fang, Quanyan Zhu
Submitted On:
24 October 2019 - 4:45pm

Document Files

postertemplate.pdf

[1] Song Fang, Quanyan Zhu, "Generic Bounds on the Maximum Deviations in Sequential/Sequence Prediction (and the Implications in Recursive Algorithms and Learning/Generalization)", IEEE SigPort, 2019. [Online]. Available: http://sigport.org/4890. Accessed: Nov. 13, 2019.

Minimax Active Learning via Minimal Model Capacity


Active learning is a form of machine learning that combines supervised learning and feedback to minimize the training set size, subject to a low generalization error. Since direct optimization of the generalization error is difficult, many heuristics have been developed that lack a firm theoretical foundation. In this paper, a new information-theoretic criterion is proposed, based on a minimax log-loss regret formulation of the active learning problem. In the first part of the paper, a Redundancy Capacity theorem for active learning is derived, along with an optimal learner.
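
For context, the classical redundancy-capacity theorem that the abstract builds on can be stated as follows; the notation is ours, and the active-learning version derived in the paper may differ in its details. For a model class $\{p_\theta : \theta \in \Theta\}$, the minimax log-loss regret of a universal predictor $q$ equals the capacity of the "channel" from the parameter to the data,
\[
\min_{q} \max_{\theta \in \Theta} D\!\left( p_\theta \,\|\, q \right) \;=\; \max_{w(\theta)} I(\theta; Y) \;=\; C,
\]
and the minimax-optimal predictor is the mixture $q^*(y) = \int w^*(\theta)\, p_\theta(y)\, d\theta$ under the capacity-achieving prior $w^*$. In the active setting the learner also chooses the queries, so one would expect the corresponding capacity to be maximized over the query strategy as well; the paper's Redundancy Capacity theorem for active learning makes this precise.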

Paper Details

Authors:
Meir Feder
Submitted On:
16 October 2019 - 4:02pm

Document Files

MLSP_2019_Minimax_Active_Learning_via_Minimal_Model_Capacity.pdf

[1] Meir Feder, "Minimax Active Learning via Minimal Model Capacity", IEEE SigPort, 2019. [Online]. Available: http://sigport.org/4876. Accessed: Nov. 13, 2019.

Exact Incremental and Decremental Learning for LS-SVM


In this paper, we present a novel incremental and decremental learning method for the least-squares support vector machine (LS-SVM). The goal is to adapt a pre-trained model to changes in the training dataset, which can include both addition and deletion of data samples, without retraining the model on all the data. We propose a provably exact method: the updated model is identical to a model trained from scratch on the entire (updated) training dataset.
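
As a hedged illustration of how an exact incremental update is possible at all, the sketch below adds one sample to the standard LS-SVM dual linear system and updates its inverse with the bordered (Schur-complement) formula, then checks that the result matches retraining from scratch. The kernel, the regularization constant C, and this particular update route are our assumptions; the paper's actual algorithm, which also handles deletion, may differ.

import numpy as np

def rbf(X1, X2, gamma_k=0.5):
    # RBF kernel matrix between row-sample matrices X1 and X2
    d2 = ((X1[:, None, :] - X2[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma_k * d2)

def lssvm_fit(X, y, C=10.0):
    # LS-SVM dual: solve [[0, 1^T], [1, K + I/C]] [b; alpha] = [0; y]
    n = len(y)
    K = rbf(X, X)
    A = np.zeros((n + 1, n + 1))
    A[0, 1:] = 1.0
    A[1:, 0] = 1.0
    A[1:, 1:] = K + np.eye(n) / C
    A_inv = np.linalg.inv(A)
    sol = A_inv @ np.concatenate(([0.0], y))
    return A_inv, sol                      # sol[0] = b, sol[1:] = alpha

def lssvm_add_sample(A_inv, X, y, x_new, y_new, C=10.0):
    # Exact incremental update: border the system matrix with the new
    # sample's row/column and update the inverse via the Schur complement.
    u = np.concatenate(([1.0], rbf(X, x_new[None, :]).ravel()))
    d = rbf(x_new[None, :], x_new[None, :])[0, 0] + 1.0 / C
    Au = A_inv @ u
    s = d - u @ Au                         # Schur complement
    top_left = A_inv + np.outer(Au, Au) / s
    A_inv_new = np.block([[top_left, -Au[:, None] / s],
                          [-Au[None, :] / s, np.array([[1.0 / s]])]])
    X_new = np.vstack([X, x_new])
    y_all = np.concatenate([y, [y_new]])
    sol = A_inv_new @ np.concatenate(([0.0], y_all))
    return A_inv_new, sol, X_new, y_all

# Sanity check: the incremental solution matches retraining from scratch.
rng = np.random.default_rng(0)
X, y = rng.normal(size=(20, 2)), rng.choice([-1.0, 1.0], size=20)
A_inv, _ = lssvm_fit(X, y)
x_new, y_new = rng.normal(size=2), 1.0
_, sol_inc, X2, y2 = lssvm_add_sample(A_inv, X, y, x_new, y_new)
_, sol_full = lssvm_fit(X2, y2)
print(np.allclose(sol_inc, sol_full))      # True (up to numerical precision)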

Paper Details

Authors:
Wei-Han Lee, Bong Jun Ko, Shiqiang Wang, Changchang Liu, Kin K. Leung
Submitted On:
23 September 2019 - 12:50pm

Document Files

ICIP 2019 talk.pdf

[1] Wei-Han Lee, Bong Jun Ko, Shiqiang Wang, Changchang Liu, Kin K. Leung, "Exact Incremental and Decremental Learning for LS-SVM", IEEE SigPort, 2019. [Online]. Available: http://sigport.org/4822. Accessed: Nov. 13, 2019.

Statistical rank selection for incomplete low-rank matrices

Paper Details

Authors:
Rui Zhang, Alexander Shapiro, Yao Xie
Submitted On:
15 May 2019 - 7:09pm

Document Files

ICASSP2019.pdf

[1] Rui Zhang, Alexander Shapiro, Yao Xie, "Statistical rank selection for incomplete low-rank matrices", IEEE SigPort, 2019. [Online]. Available: http://sigport.org/4534. Accessed: Nov. 13, 2019.

A Characterization of Stochastic Mirror Descent Algorithms and Their Convergence Properties


Stochastic mirror descent (SMD) algorithms have recently garnered a great deal of attention in optimization, signal processing, and machine learning. They are similar to stochastic gradient descent (SGD), in that they perform updates along the negative gradient of an instantaneous (or stochastically chosen) loss function. However, rather than update the parameter (or weight) vector directly, they update it in a "mirrored" domain whose transformation is given by the gradient of a strictly convex differentiable potential function.
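
To make the "mirrored domain" concrete, here is a minimal sketch of the SMD update for one standard choice of potential, the negative entropy on the probability simplex, for which the mirror step reduces to exponentiated gradient. The toy problem and step size are our own; the paper's analysis covers general strictly convex differentiable potentials.

import numpy as np

def smd_negative_entropy(grad_fn, w0, lr=0.1, steps=200):
    """Stochastic mirror descent with the negative-entropy potential
    psi(w) = sum_i w_i log w_i, whose mirror map is grad psi(w) = 1 + log w.
    The update runs in the mirrored (dual) domain,
        grad psi(w_{t+1}) = grad psi(w_t) - lr * g_t,
    which, after mapping back and renormalizing onto the simplex,
    is the familiar exponentiated-gradient rule."""
    w = np.asarray(w0, dtype=float)
    for _ in range(steps):
        g = grad_fn(w)                    # stochastic gradient at w
        w = w * np.exp(-lr * g)           # dual-domain step, mapped back
        w /= w.sum()                      # Bregman (KL) projection onto simplex
    return w

# Toy problem: minimize E||A w - b||^2 over the probability simplex,
# using a noisy (single-row) gradient as the instantaneous loss.
rng = np.random.default_rng(0)
A = rng.normal(size=(50, 5))
w_true = np.array([0.6, 0.2, 0.1, 0.05, 0.05])
b = A @ w_true

def noisy_grad(w):
    i = rng.integers(0, A.shape[0])       # sample one row at random
    return 2.0 * A[i] * (A[i] @ w - b[i])

w_hat = smd_negative_entropy(noisy_grad, np.ones(5) / 5, lr=0.05, steps=5000)
print(np.round(w_hat, 3))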

Paper Details

Authors:
Navid Azizan, Babak Hassibi
Submitted On:
13 May 2019 - 8:33pm

Document Files

ICASSP-SMD-Poster.pdf

[1] Navid Azizan, Babak Hassibi, "A Characterization of Stochastic Mirror Descent Algorithms and Their Convergence Properties", IEEE SigPort, 2019. [Online]. Available: http://sigport.org/4498. Accessed: Nov. 13, 2019.

A Fast Method of Computing Persistent Homology of Time Series Data

Paper Details

Authors:
Kazuyuki Aihara
Submitted On:
10 May 2019 - 10:46am

Document Files

ICASSP2019_poster_tsuji_20190508.pdf

[1] Kazuyuki Aihara, "A Fast Method of Computing Persistent Homology of Time Series Data", IEEE SigPort, 2019. [Online]. Available: http://sigport.org/4348. Accessed: Nov. 13, 2019.

Pairwise Approximate K-SVD


Pairwise, or separable, dictionaries are suited to the sparse representation of 2D signals in their original form, without vectorization. They are equivalent to enforcing a Kronecker structure on a standard dictionary for 1D signals. We present a dictionary learning algorithm for such dictionaries, in the coordinate-descent style of Approximate K-SVD. The algorithm has the benefit of extremely low complexity, clearly lower than that of existing algorithms.
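
The Kronecker equivalence mentioned above is easy to check numerically; the snippet below only illustrates the separable signal model (with arbitrary random dictionaries of our choosing), not the learning algorithm itself.

import numpy as np

rng = np.random.default_rng(0)
m1, m2, n1, n2 = 8, 8, 12, 12
D1 = rng.normal(size=(m1, n1))    # left dictionary (acts on columns)
D2 = rng.normal(size=(m2, n2))    # right dictionary (acts on rows)
S = rng.normal(size=(n1, n2)) * (rng.random((n1, n2)) < 0.1)   # sparse 2D code

# Separable representation of a 2D signal ...
X = D1 @ S @ D2.T

# ... equals the standard 1D model with a Kronecker-structured dictionary
# applied to the vectorized signal: vec(X) = (D2 kron D1) vec(S),
# using column-major vec and the identity vec(ABC) = (C^T kron A) vec(B).
lhs = X.flatten(order="F")
rhs = np.kron(D2, D1) @ S.flatten(order="F")
print(np.allclose(lhs, rhs))      # True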

Paper Details

Authors:
Paul Irofti, Bogdan Dumitrescu
Submitted On:
10 May 2019 - 3:58am

Document Files

pair-ksvd-poster.pdf

[1] Paul Irofti, Bogdan Dumitrescu, "Pairwise Approximate K-SVD", IEEE SigPort, 2019. [Online]. Available: http://sigport.org/4284. Accessed: Nov. 13, 2019.
