I would encourage you to think about expanding the context window, that is, how much of the text the model gets to see. Maybe add a toggle so it only happens when we want it to.
The latest LLMs are extremely powerful. I’m impressed by o1. I keep hearing about very smart people being blown away by o1 and o1 pro. And I mean scholars, economists, philosophers, who get better answers from these models than they get from their peers, who are experts in their field. And not just factual answers, but reasoned arguments. These models will keep getting smarter and smarter.
Given that I do a lot of reading on Readlang, I think there should be a way to use LLMs not just to learn words but to engage in conversations about the text. To do that, we have to give the model access to more text: the entire paragraph, or even the entire chapter. We don’t need that to understand words, but we do need it to get the most out of the models. Which is why it should be a toggle.
Keep it a premium feature, but maybe add an option to plug in your own API key, since the compute cost will start to pile up quickly.
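Just to make the idea concrete, here is a rough sketch of how I picture it. All the names are made up (ContextScope, askAboutText, the userApiKey field); only the OpenAI chat completions call itself is real. Treat it as an illustration of the toggle plus bring-your-own-key idea, not a proposal for how it should actually be built:

```typescript
// Rough sketch only. ContextScope, AskOptions, and askAboutText are names I
// made up to illustrate the idea; they are not part of Readlang.

type ContextScope = "word" | "paragraph" | "chapter"; // the toggle I'm asking for

interface AskOptions {
  scope: ContextScope;
  question: string;
  selection: string;   // the word or phrase the reader clicked
  paragraph: string;   // the surrounding paragraph
  chapter?: string;    // full chapter text, only used when scope is "chapter"
  userApiKey?: string; // optional: let readers plug in their own key
}

// Placeholder for whatever key Readlang would use on its side.
const READLANG_KEY = "sk-...";

async function askAboutText(opts: AskOptions): Promise<string> {
  // The toggle simply decides how much text the model gets to see.
  const context =
    opts.scope === "word" ? opts.selection :
    opts.scope === "paragraph" ? opts.paragraph :
    opts.chapter ?? opts.paragraph;

  // Standard OpenAI chat completions call; any comparable API would do.
  const response = await fetch("https://api.openai.com/v1/chat/completions", {
    method: "POST",
    headers: {
      "Content-Type": "application/json",
      // If the reader supplied their own key, the compute cost is theirs.
      Authorization: `Bearer ${opts.userApiKey ?? READLANG_KEY}`,
    },
    body: JSON.stringify({
      model: "gpt-4o", // or whatever the latest model is at the time
      messages: [
        { role: "system", content: "You help a reader discuss the book they are reading." },
        { role: "user", content: `Text:\n${context}\n\nQuestion: ${opts.question}` },
      ],
    }),
  });

  const data = await response.json();
  return data.choices[0].message.content;
}
```

The point is simply that the toggle decides how much text gets sent, and the optional key decides who pays for it.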
Right now, I have to copy the paragraphs and paste them into a chat. I usually do it on my phone; I read on Readlang on a Boox device. I would much rather do this inside Readlang. Yes, I know this is a language learning tool. But it’s also a reading tool, and we don’t read just to learn languages. For me it’s the opposite: I acquire languages so I can read. I read all my electronic books with Readlang. Why read with any other app? And reading involves thinking, and these models are becoming a more and more important tool for that.
I think this is the best example of what I’m talking about. I wrote an earlier post about this, but I believe that, for some of us, reading will evolve in this direction: as the models get better, we will use them more and more to understand the text and the references in it.
I haven’t got my Daylight device yet, so I’m not sure if it will be able to do this with Readlang or if I’ll have to use their own reading app for that. But if there is one thing that could get me to move away from reading with Readlang in the languages I know well enough, it’s not being able to do this. If you build this into Readlang, I’m a captive audience. Why use anything else?
The thing is, everything is pretty much built. Just allow us to expand the context window. Some people might want to use voice and talk to the model since it’s easier than typing on the device.
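For the voice part, browsers already ship a speech recognition API (the Web Speech API), so in principle a spoken question could be fed into the same chat. Again just a sketch, and askAboutText is the made-up function from the earlier snippet; support for this API varies a lot by browser and device:

```typescript
// Sketch of voice input using the browser's Web Speech API.
// webkitSpeechRecognition isn't in TypeScript's standard DOM typings,
// so it's accessed loosely here; browser/device support varies.

function listenForQuestion(onQuestion: (text: string) => void): void {
  const Recognition =
    (window as any).SpeechRecognition ?? (window as any).webkitSpeechRecognition;
  if (!Recognition) {
    console.warn("Speech recognition isn't supported on this device");
    return;
  }

  const recognition = new Recognition();
  recognition.lang = "en-US";         // or the language of the book
  recognition.interimResults = false; // only the final transcript

  recognition.onresult = (event: any) => {
    const transcript = event.results[0][0].transcript;
    onQuestion(transcript); // e.g. pass it on to askAboutText(...)
  };

  recognition.start();
}
```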
You basically already have everything you need to expand the use of Readlang. I keep coming back to this theme: Readlang has the best experience for reading a foreign language at an advanced level, which is a specific, niche use case. But it could be so much more than that. And some use cases are low-hanging fruit; they would be so easy to realize. This is one of them. I would use it for everything: talking about German grammar, about the philosophical arguments Girard makes, about events, people, and places mentioned in the Papyrus book I’m reading in Spanish, or exploring themes in Shakespeare and discussing Harold Bloom’s and Girard’s interpretations of Hamlet. It’s all there. And whichever app manages to make this interaction the most seamless and hassle-free will win.
Of course, we’ll also need access to the latest models available. Talking with o1 is so much better than talking to 4o.