Releases: wisedev-code/MaIN.NET
MaIN.NET.0.2.2
0.2.2 release
- Added Gemini support to the infer CLI command
- Package upgrades
- New CLI release (support for Qwen3-8b, Qwen3-14b, Phi3.5, and Phi4)
- Multimodal and KM improvements
MaIN.NET.0.2.1
0.2.1 release
- Added Gemini (Google AI) chat integration
MaIN.NET.0.1.9
0.1.9 release
Performance improvements
- (LLamaSharp) Migrated from ChatSession to Conversation and BatchedExecutor
- Fixes for KernelMemory: memory is now properly disposed
- New example (conversation agent)
- CLI release (big performance improvement for infer)
- Option to enable or disable the cache for model weights
- Option to disable built-in llama.cpp logs and MaIN notifications
The new version of the CLI can be downloaded from https://maindoc.link/#/doc/cli, where the described changes are documented.
MaIN.NET.0.1.8
0.1.8 release
- Updated dependencies in nuspec
MaIN.NET.0.1.7
0.1.7 release
- Allow different backends in the same application/context
- Documentation update & extension
- New example MultiBackendWithRedirect
MaIN.NET.0.1.6
0.1.6 release
- Enable pre-processing of documents, which can greatly improve KM performance on small models
MaIN.NET.0.1.5
0.1.5 release
- Enable pre-processing of documents, which can greatly improve KM performance on small models
MaIN.NET.0.1.4
0.1.4 release
- Bug fix (OpenAI integration): memory params are now propagated when using the OpenAI integration
MaIN.NET.0.1.3
0.1.3 release
- Bug fix for multiple messages in chat while using memory
- Bug fix for missing usage of memory params in the agent
- Bug fix for not clearing the previous message in the WithFiles method (it can now be used multiple times within a context)
- Allow configuring the fetched data output (fixes an issue with the Fetch/Answer command)
MaIN.NET.0.1.2
0.1.2 release
- Cleanup of leftover code
- Performance improvements
- Stability improvements
- Proper integration with Kernel Memory
- Usage of configuration parameters (MemoryParams)
Changes are reflected in docs: https://maindoc.link