Recognising human activities from sensors embedded in an environment or worn on the body is an important and challenging research topic in pervasive computing. Existing work on activity recognition is mainly concerned with identifying single-user sequential activities from well-scripted or pre-segmented sequences of sensor events. However, a real-world environment often contains multiple users, each performing activities simultaneously, in their own way, and with no explicit instructions to follow. Recognising multi-user concurrent activities is challenging, but essential for designing applications for real environments. This paper presents a novel Knowledge-driven approach for Concurrent Activity Recognition (KCAR). Within KCAR, we explore the semantics underlying each sensor event and use semantic dissimilarity to segment a continuous sensor sequence into fragments, each corresponding to one ongoing activity. We exploit the Pyramid Match Kernel, whose strength lies in approximate matching on hierarchical concepts, to recognise activities of varying granularity from a potentially noisy sensor sequence. We conduct an empirical evaluation on a large-scale real-world data set collected over one year and consisting of 2.8 million sensor events. Our results demonstrate that KCAR achieves an average recognition accuracy of 91%.
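The Pyramid Match Kernel referenced above scores the similarity of two feature sets by intersecting multi-resolution histograms, crediting newly formed matches less at coarser levels. The following is a minimal numeric sketch of that general idea, not the paper's implementation; all function names, parameters, and the toy 1-D feature space are illustrative assumptions.

```python
import numpy as np

def histogram(points, level, dims, max_val):
    # Illustrative helper: bin side length doubles at each coarser level.
    side = 2 ** level
    bins = int(np.ceil(max_val / side))
    H = np.zeros([bins] * dims)
    for p in points:
        idx = tuple(int(c // side) for c in p)
        H[idx] += 1
    return H

def pyramid_match(X, Y, levels, dims=1, max_val=16):
    # Weighted sum of *new* histogram intersections: matches that first
    # appear at level i contribute with weight 1 / 2**i, so matches made
    # in finer (smaller) bins count more than coarse-level matches.
    prev_inter = 0.0
    score = 0.0
    for i in range(levels):
        hx = histogram(X, i, dims, max_val)
        hy = histogram(Y, i, dims, max_val)
        inter = np.minimum(hx, hy).sum()
        score += (inter - prev_inter) / (2 ** i)
        prev_inter = inter
    return score
```

For example, the sets `[(1,), (2,)]` and `[(1,), (9,)]` match exactly on one element at the finest level and only coarsely on the other, so their score falls between the all-match and no-match extremes; identical sets attain the maximum score.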