[dictionaries] Remove the convention of separating ideograms with a space; the tokenizer can accomplish the same thing
@@ -2,5 +2,5 @@
 gu
 동
 dong
-번 지
+번지
 beonji
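A minimal Python sketch of the reasoning in the commit message, assuming a tokenizer that emits one token per Hangul/CJK character (the tokenize helper below is hypothetical, not the project's actual implementation): under that assumption, 번 지 and 번지 yield the same token sequence, so the explicit space in the dictionary entry is redundant.

import re

# Hypothetical helper, not the project's tokenizer: emit each Hangul/CJK
# character as its own token, keep runs of other characters together, and
# treat whitespace only as a token separator.
CJK_CHAR = re.compile(r"[\u1100-\u11FF\u3130-\u318F\uAC00-\uD7A3\u4E00-\u9FFF]")

def tokenize(text: str) -> list[str]:
    tokens = []
    buf = ""
    for ch in text:
        if ch.isspace():
            if buf:
                tokens.append(buf)
                buf = ""
        elif CJK_CHAR.match(ch):
            if buf:
                tokens.append(buf)
                buf = ""
            tokens.append(ch)  # one token per ideogram / Hangul syllable
        else:
            buf += ch          # accumulate Latin letters, digits, etc.
    if buf:
        tokens.append(buf)
    return tokens

# Old entry "번 지" and new entry "번지" tokenize identically,
# so the space carries no extra information for matching.
assert tokenize("번 지") == tokenize("번지") == ["번", "지"]
assert tokenize("123 번지") == ["123", "번", "지"]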