
Task-aware neural architecture search

Citation Author(s):
Submitted by: Cat Le
Last updated: 21 June 2021 - 6:00pm
Document Type: Poster
Document Year: 2021
Event:
Presenters: Cat P. Le
Paper Code: MLSP-48.1

Designing neural networks by hand requires considerable time and resources. Recent Neural Architecture Search (NAS) techniques have proven competitive with, or better than, traditional handcrafted design, although they still require domain knowledge and have generally relied on limited search spaces. In this paper, we propose a novel framework for neural architecture search that uses a dictionary of models trained on base tasks and the similarity between the target task and the atoms of this dictionary to generate an adaptive search space built from those base models. By introducing a gradient-based search algorithm, we can evaluate and discover the best architecture in the search space without fully training the candidate networks. The experimental results show the efficacy of our proposed task-aware approach.
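The abstract describes a two-step idea: score each base-task model (an "atom" of the dictionary) by its similarity to the target task, then let the most similar atoms define the adaptive search space that the gradient-based search explores. The sketch below illustrates only that selection step under stated assumptions; the function names, the similarity callable, and the top-k heuristic are illustrative placeholders, not the authors' implementation, and the gradient-based architecture search itself is not shown.

    from typing import Any, Callable, Dict, List

    def build_adaptive_search_space(
        target_task: Any,
        dictionary: Dict[str, Any],               # base-task name -> pretrained model (atom); assumed structure
        similarity: Callable[[Any, Any], float],  # hypothetical task-similarity measure
        top_k: int = 3,
    ) -> List[str]:
        """Rank the dictionary atoms by similarity to the target task and keep the
        closest ones; their architectures would seed the adaptive search space."""
        scores = {name: similarity(atom, target_task) for name, atom in dictionary.items()}
        ranked = sorted(scores, key=scores.get, reverse=True)
        return ranked[:top_k]

    # Toy usage with a made-up dictionary and similarity (illustrative only):
    toy_dictionary = {"mnist": "model_a", "cifar10": "model_b", "svhn": "model_c"}
    toy_similarity = lambda atom, task: 1.0 if atom == "model_b" else 0.0
    print(build_adaptive_search_space("target", toy_dictionary, toy_similarity, top_k=2))
    # -> ['cifar10', 'mnist']

In this reading, restricting the search space to architectures derived from the most task-similar base models is what makes the subsequent gradient-based search tractable without fully training every candidate.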
