Today marks a major milestone for HackerGPT. We have released the Lite version, which is fully self-sufficient: we no longer rely on the OpenAI API to answer our users' questions.
We have great respect for OpenAI and will always view them as trailblazers of the revolutionary LLM technology.
Like all good researchers and engineers, however, we do not stay bound by other organizations' rules and policies, and we no longer felt comfortable sharing user prompts with OpenAI. This required a leap of faith and was a considerable step toward independence and maturity.
ALL prompts from free users are now processed on servers in our physical possession and under our control. They are not stored “in the cloud” or on a third-party server; they are not sent to OpenAI, to an Azure or AWS EC2 instance running Ollama, or to any third party whatsoever. It took almost a year to achieve this, and we are truly proud.
We encourage you to reach out and share ideas on how we can keep our service independent and decentralized. While we are in full control of how the information we receive from our users is stored and processed, that may not be enough; the next step is decentralizing processing without compromising privacy.
If you have ideas or GPU resources you could contribute to make HackerGPT more accessible, feel free to reach out to us at support@hackergpt.app.