Taurer, F., Wolling, F., Moore, J., & Michahelles, F. (2023). Smart Trash Can: Easy Collection of Photos of Organic Waste in the Home. In MUM ’23: Proceedings of the 22nd International Conference on Mobile and Ubiquitous Multimedia (pp. 571–573). Association for Computing Machinery. https://doi.org/10.1145/3626705.3631881
E193-04 - Research Unit of Artifact-based Computing & User Research
-
Published in:
MUM '23: Proceedings of the 22nd International Conference on Mobile and Ubiquitous Multimedia
-
ISBN:
9798400709210
-
Date (published):
3-Dec-2023
-
Event name:
22nd International Conference on Mobile and Ubiquitous Multimedia (MUM 2023)
-
Event date:
3-Dec-2023 - 6-Dec-2023
-
Event place:
Vienna, Austria
-
Number of Pages:
3
-
Publisher:
Association for Computing Machinery, New York
-
Keywords:
impurity detection; organic waste; dataset; data collection; in-the-wild study; sustainability
-
Abstract:
The recycling of organic waste through composting or fermentation is an easy yet underestimated way to bind CO2. Due to a lack of awareness, society is still insufficiently separating recyclable organic waste from unsuitable residual waste. Any impurities inevitably reduce the quality of a single container and, thus, the entire load of a waste transporter, resulting in its expensive and less sustainable energy recovery through incineration. To prevent the addition of impurities and the inevitable incineration of valuable organic waste, the problem must be tackled directly at the producer’s site. In the context of a preliminary experimental study, a Smart Trash Can has been developed to automatically take photos of the waste each time it is filled. It serves as a convenient means to collect a series of photos that will contribute to a large dataset for the training of machine-learning models to assess and monitor waste quality. In this way, not only can contaminated waste be separated from pure organic waste, but also feedback, incentives, or nudges can be provided to the user to lead to better waste separation.
-
Research Areas:
Visual Computing and Human-Centered Technology: 50%; Sustainable Production and Technologies: 20%; Sensor Systems: 30%