8note | 3 days ago | on: AI will make formal verification go mainstream
The question remains: is the tokenizer going to be a fundamental limit to my task? How do I know ahead of time?
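One way to get a feel for this ahead of time is to run representative inputs through the tokenizer itself and see how they split. A minimal sketch in Python, assuming tiktoken's cl100k_base encoding stands in for whatever tokenizer your target model actually uses:

    # Sketch: inspect how a tokenizer splits representative inputs before
    # committing to a task. Assumes tiktoken's cl100k_base encoding; swap in
    # the tokenizer of the model you actually plan to use.
    import tiktoken

    enc = tiktoken.get_encoding("cl100k_base")

    def show_tokens(text: str) -> None:
        ids = enc.encode(text)
        pieces = [enc.decode([i]) for i in ids]
        print(f"{len(ids):>3} tokens: {pieces}")

    # Formal-methods syntax and unusual symbols often shatter into many
    # sub-word tokens, an early hint that tokenization may fight the task.
    show_tokens("forall x, P(x) -> Q(x)")
    show_tokens("∀ x, P x → Q x")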
worldsayshi | 3 days ago
Would it limit a person getting your instructions in Chinese? Tokenisation pretty much means that the LLM is reading symbols instead of phonemes.
This makes me wonder if LLMs work better in Chinese.
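You can at least compare token counts for the same sentence in both languages empirically. A small sketch, again assuming tiktoken's cl100k_base encoding (counts vary a lot between models and vocabularies):

    # Sketch: compare how one sentence tokenizes in English vs. Chinese.
    # Assumes tiktoken's cl100k_base encoding; results differ per model.
    import tiktoken

    enc = tiktoken.get_encoding("cl100k_base")

    english = "Formal verification will go mainstream."
    chinese = "形式化验证将成为主流。"  # rough Chinese rendering of the same sentence

    for text in (english, chinese):
        print(f"{len(enc.encode(text)):>3} tokens for {text!r}")

    # Chinese packs more meaning into each character, but a BPE vocabulary
    # trained mostly on English text may spend several tokens per character,
    # so whether LLMs "work better" in Chinese is not obvious either way.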