Smart buildings benefit from software applications that investigate energy and operational efficiency. Yet widespread adoption of such applications is hindered by heterogeneous data representations in the building industry, such that applications built for one building are not readily reusable for another. Ontologies, otherwise known as standardised metadata schemas, have long been used as an interoperability standard to reduce reliance on proprietary naming conventions and enable application portability. Like other software systems, the success of an ontology is contingent on its usability and perspicuity. Despite this, ontology development practices routinely exclude users, resulting in many ontologies that are incomprehensible or insufficient for end-users' needs. Where user-centred ontology development has been documented, little consideration is given to the methods used to engage users, and no feedback is reported on how well these techniques worked or with what limitations. As such, this project explores human-computer interaction in ontologies through a user-based case study evaluating Brick, an open-source building ontology. The study centred on individual interviews with four building services professionals recruited across two organisations. Feedback regarding the completeness, expressivity, and usability of the ontology was gathered and analysed using thematic coding techniques. The findings of the study provide both a user assessment of the Brick model and learnings on applying human-computer interaction techniques to ontology engineering. The study revealed that a user-based evaluation provides valuable feedback and insight into domain modelling that traditional ontology evaluation cannot. In particular, the study found evidence that user involvement improves the perspicuity of the ontology and bridges the gap between the language of the ontology and that of the user.
Additionally, the visual presentation of the ontology had a strong impact on the type and merit of the feedback evoked, suggesting key learnings on how best to present ontologies throughout the design process.