Research Output
Towards optimisers that 'Keep Learning'
  We consider optimisation in the context of applying an optimiser to a continual stream of instances from one or more domains, and examine how such a system might 'keep learning': drawing on past experience to improve performance, and learning both to predict and to react to instance and/or domain drift.

  • Date:

    24 July 2023

  • Publication Status:

    Published

  • Publisher:

    ACM

  • DOI:

    10.1145/3583133.3596344

  • Funders:

    Engineering and Physical Sciences Research Council

Citation

Hart, E., Miguel, I., Stone, C., & Renau, Q. (2023, July). Towards optimisers that 'Keep Learning'. Presented at the Companion Conference on Genetic and Evolutionary Computation, Lisbon, Portugal.

Keywords

Optimisation, continual learning, transfer-learning
