Constraint-aware recommendation of complex items

Mathias Uta, Alexander Felfernig, Denis Helic

Research output: Contribution to journal › Article › Research › peer-review

Abstract

In contrast to basic items such as movies, books, and songs, configurable items consist of individual subcomponents that can be combined following a predefined set of constraints. Due to the increasing size and complexity of configurable items (e.g., cars and software), a simple enumeration of all possible configurations in terms of a product catalog is not possible. Configuration systems try to identify a solution (configuration) that takes into account both the preferences of the user and a set of constraints defining how individual subcomponents are allowed to be combined. Due to time limitations, cognitive overload, and missing domain knowledge, configurator users are often unable to completely specify their preferences with regard to all relevant component properties. As a consequence, recommendation technologies that can predict the relevance of individual components for the current user need to be integrated into configurators. In this paper, we show how the determination of configurations can be supported by neural-network-based recommendation. This approach helps to predict user-relevant item properties using historical interaction data. In this context, we introduce a semantic regularization approach that takes configuration constraints into account within the scope of neural network learning. Furthermore, we demonstrate the applicability of our approach on the basis of an evaluation in an industrial configuration scenario (high-voltage switchgear configuration).
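The abstract's core idea, regularizing a neural recommender so that it respects configuration constraints, can be illustrated with a minimal sketch. The paper's exact formulation is not given here; the penalty term, the pairwise incompatibility encoding, and all names (`constraint_penalty`, `regularized_loss`, `incompatible_pairs`, `lam`) are illustrative assumptions, not the authors' actual method.

```python
import numpy as np

def constraint_penalty(scores, incompatible_pairs):
    # Hypothetical semantic penalty: components that a configuration
    # constraint forbids from being combined should not both receive
    # high predicted relevance, so we penalize their score products.
    return sum(scores[i] * scores[j] for i, j in incompatible_pairs)

def regularized_loss(scores, targets, incompatible_pairs, lam=1.0):
    # Standard prediction loss (MSE) plus a weighted
    # constraint-violation penalty, as in common regularization schemes.
    mse = float(np.mean((scores - targets) ** 2))
    return mse + lam * constraint_penalty(scores, incompatible_pairs)

# Example: assume components 0 and 2 must not be combined.
scores = np.array([0.9, 0.1, 0.8])   # predicted relevance per component
targets = np.array([1.0, 0.0, 1.0])  # observed user choices
loss = regularized_loss(scores, targets, incompatible_pairs=[(0, 2)], lam=0.5)
```

In such a scheme, the network is trained to minimize this combined loss, so predictions are pulled both toward the historical interaction data and away from constraint-violating component combinations.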
Original language: English
Journal: CEUR Workshop Proceedings
Volume: 2960
Publication status: Published - 1 Sept 2021

Keywords

  • Knowledge representation and reasoning
  • Neural networks
  • Recommender systems
