Semantic Web
A Personal Research Agent for Semantic Knowledge Management of Scientific Literature
Submitted by bahar on Thu, 2018-06-28 10:31
The LODeXporter: Flexible Generation of Linked Open Data Triples from NLP Frameworks for Automatic Knowledge Base Construction
Submitted by bahar on Thu, 2018-06-28 10:25
Semantic representation of scientific literature: bringing claims, contributions and named entities onto the Linked Open Data cloud
Submitted by bahar on Thu, 2016-01-07 17:09
SAVE-SD 2015 Publication: Supplementary Material
This page provides supplementary material for our publication in the SAVE-SD 2015 workshop on Semantics, Analytics, Visualisation: Enhancing Scholarly Data. We have published the knowledge base populated in the experiments described in the paper. To reproduce the results in the "Application" section, you can execute the queries via the link to the full page below.
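As a quick illustration of how such queries can also be run offline, the following sketch loads a local copy of the knowledge base with Python's rdflib and executes a simple SPARQL query. The file name and the query shown here are placeholders for illustration only, not the actual queries from the paper.

    from rdflib import Graph

    # Hypothetical local copy of the published knowledge base; replace with the
    # actual dump or endpoint linked from the supplementary material page.
    g = Graph()
    g.parse("savesd2015-kb.ttl", format="turtle")

    # Illustrative query only: count how often each predicate occurs. The
    # queries from the paper's "Application" section would go here instead.
    query = """
        SELECT ?p (COUNT(*) AS ?n)
        WHERE { ?s ?p ?o }
        GROUP BY ?p
        ORDER BY DESC(?n)
    """

    for predicate, count in g.query(query):
        print(predicate, count)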
Tutorial: Adding Natural Language Processing Support to your (Semantic) MediaWiki
Wikis have become powerful knowledge management platforms, offering high customizability while remaining relatively easy to deploy and use. With a majority of their content in natural language, wikis can greatly benefit from automated text analysis techniques. Natural Language Processing (NLP) is a branch of computer science that employs various Artificial Intelligence (AI) techniques to process content written in natural language. NLP-enhanced wikis can support users in finding, developing and organizing knowledge contained inside the wiki repository. Rather than relying on external NLP applications, we developed an approach that brings NLP as an integrated feature to wiki systems, thereby creating new human/AI collaboration patterns, where human users work together with automated "intelligent assistants" on developing, structuring and improving wiki content. This is achieved with our open source Wiki-NLP integration, a Semantic Assistants add-on that lets you incorporate NLP services into the MediaWiki environment, thereby enabling wiki users to benefit from modern text mining techniques.
This tutorial has two main parts: In the first part, we will present an introduction to NLP and text mining, as well as related frameworks, in particular the General Architecture for Text Engineering (GATE) and the Semantic Assistants framework. Building on the foundations covered in the first part, we will then look into the Wiki-NLP integration and show how you can add arbitrary text processing services to your (Semantic) MediaWiki instance with minimal effort. Throughout the tutorial, we illustrate the application of NLP in wikis with a number of application examples from various domains that we developed in our research over the last decade, such as cultural heritage data management, collaborative software requirements engineering, and biomedical knowledge management. These showcases of the Wiki-NLP integration highlight a number of integration patterns that will help you adopt this technology for your own domain.
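To give a flavour of the integration pattern, here is a minimal, self-contained sketch that retrieves the plain text of a wiki page through the standard MediaWiki API and passes it to an NLP web service. The service URL and its JSON request/response format are assumptions for illustration only; the actual Wiki-NLP integration runs as a Semantic Assistants add-on inside MediaWiki rather than as an external script.

    import requests

    WIKI_API = "https://example.org/w/api.php"        # assumption: your wiki's api.php URL
    NLP_SERVICE = "https://example.org/nlp/annotate"  # assumption: a hypothetical NLP service

    def get_page_text(title):
        """Fetch a plain-text extract of a wiki page via the MediaWiki API
        (requires the TextExtracts extension on the wiki)."""
        params = {
            "action": "query",
            "prop": "extracts",
            "explaintext": True,
            "titles": title,
            "format": "json",
        }
        pages = requests.get(WIKI_API, params=params).json()["query"]["pages"]
        return next(iter(pages.values()))["extract"]

    def annotate(text):
        """Send the text to the (hypothetical) NLP service and return its annotations."""
        response = requests.post(NLP_SERVICE, json={"text": text})
        return response.json()

    if __name__ == "__main__":
        print(annotate(get_page_text("Main Page")))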
Supporting Researchers with a Semantic Literature Management Wiki
Submitted by bahar on Tue, 2014-08-05 13:14
Proceedings of the 4th Canadian Semantic Web Symposium (CSWS 2013) now at CEUR
The complete proceedings of the 4th Canadian Semantic Web Symposium (CSWS 2013) are now available on the CEUR-WS.org website as Volume 1054.
The complete volume contains abstracts for the two invited talks, two full papers, two short papers, five early career track papers, and four systems papers. Individual papers can be downloaded from the CEUR-WS.org site, where you can also find a BibTeX file with all references.
Semantic Content Processing in Web Portals
Submitted by rene on Sun, 2013-10-06 16:58
Personalized Semantic Assistance for the Curation of Biochemical Literature
Submitted by mj on Mon, 2012-10-15 13:27
Natural Language Processing for Semantic Assistance in Web Portals
Submitted by mj on Mon, 2012-10-15 13:19