Using Milvus-Lite Now

The easy way to build complete Gen AI solutions anywhere

A revolutionary new release is out that brings your AI database anywhere and makes developing enterprise Gen AI applications easy.

It is easy to get started.

pip3 install -U pymilvus

You need version 2.4.3, where Milvus Lite is embedded in the Python client for Milvus.

I installed this on my Mac M1 in seconds. Let’s rock.

I am going to install on some edge devices as well and follow that up with some cool examples in the coming weeks.

There are a lot of different types of Python environments. Installing this in a Dockerized environment, a Jupyter notebook, or a virtual Python environment is probably wise, as there are so many libraries needed in most AI applications.

If you are okay with it, for Python 3.11, you can always do this:

pip3 install -U "pymilvus==2.4.3" --break-system-packages

That is a last resort; I recommend going another way. You know it's the least recommended approach when they give you three pages of warnings and then make you tack that extra flag on the end. So go with a virtual environment, please. Once installed, you can get started quickly.

A better way is with a Python 3 Virtual Environment.

timothyspann@MacBook-Pro code % python3 -m venv milvusvenv
timothyspann@MacBook-Pro code % source milvusvenv/bin/activate
(milvusvenv) timothyspann@MacBook-Pro code % python3 -m pip install pymilvus -U
Collecting pymilvus
Using cached pymilvus-2.4.3-py3-none-any.whl.metadata (5.3 kB)
Collecting setuptools>=67 (from pymilvus)
Using cached setuptools-70.0.0-py3-none-any.whl.metadata (5.9 kB)
Collecting grpcio<=1.63.0,>=1.49.1 (from pymilvus)
Using cached grpcio-1.63.0-cp312-cp312-macosx_10_9_universal2.whl.metadata (3.2 kB)
Collecting protobuf>=3.20.0 (from pymilvus)
Using cached protobuf-5.27.0-cp38-abi3-macosx_10_9_universal2.whl.metadata (592 bytes)
Collecting environs<=9.5.0 (from pymilvus)
Using cached environs-9.5.0-py2.py3-none-any.whl.metadata (14 kB)
Collecting ujson>=2.0.0 (from pymilvus)
Using cached ujson-5.10.0-cp312-cp312-macosx_11_0_arm64.whl.metadata (9.3 kB)
Collecting pandas>=1.2.4 (from pymilvus)
Using cached pandas-2.2.2-cp312-cp312-macosx_11_0_arm64.whl.metadata (19 kB)
Collecting milvus-lite<2.5.0,>=2.4.0 (from pymilvus)
Downloading milvus_lite-2.4.6-py3-none-macosx_11_0_arm64.whl.metadata (5.6 kB)
Collecting marshmallow>=3.0.0 (from environs<=9.5.0->pymilvus)
Using cached marshmallow-3.21.2-py3-none-any.whl.metadata (7.1 kB)
Collecting python-dotenv (from environs<=9.5.0->pymilvus)
Using cached python_dotenv-1.0.1-py3-none-any.whl.metadata (23 kB)
Collecting numpy>=1.26.0 (from pandas>=1.2.4->pymilvus)
Using cached numpy-1.26.4-cp312-cp312-macosx_11_0_arm64.whl.metadata (61 kB)
Collecting python-dateutil>=2.8.2 (from pandas>=1.2.4->pymilvus)
Using cached python_dateutil-2.9.0.post0-py2.py3-none-any.whl.metadata (8.4 kB)
Collecting pytz>=2020.1 (from pandas>=1.2.4->pymilvus)
Using cached pytz-2024.1-py2.py3-none-any.whl.metadata (22 kB)
Collecting tzdata>=2022.7 (from pandas>=1.2.4->pymilvus)
Using cached tzdata-2024.1-py2.py3-none-any.whl.metadata (1.4 kB)
Collecting packaging>=17.0 (from marshmallow>=3.0.0->environs<=9.5.0->pymilvus)
Using cached packaging-24.0-py3-none-any.whl.metadata (3.2 kB)
Collecting six>=1.5 (from python-dateutil>=2.8.2->pandas>=1.2.4->pymilvus)
Using cached six-1.16.0-py2.py3-none-any.whl.metadata (1.8 kB)
Using cached pymilvus-2.4.3-py3-none-any.whl (194 kB)
Using cached environs-9.5.0-py2.py3-none-any.whl (12 kB)
Using cached grpcio-1.63.0-cp312-cp312-macosx_10_9_universal2.whl (10.1 MB)
Downloading milvus_lite-2.4.6-py3-none-macosx_11_0_arm64.whl (19.8 MB)
━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 19.8/19.8 MB 39.2 MB/s eta 0:00:00
Using cached pandas-2.2.2-cp312-cp312-macosx_11_0_arm64.whl (11.3 MB)
Using cached protobuf-5.27.0-cp38-abi3-macosx_10_9_universal2.whl (412 kB)
Using cached setuptools-70.0.0-py3-none-any.whl (863 kB)
Using cached ujson-5.10.0-cp312-cp312-macosx_11_0_arm64.whl (51 kB)
Using cached marshmallow-3.21.2-py3-none-any.whl (49 kB)
Using cached numpy-1.26.4-cp312-cp312-macosx_11_0_arm64.whl (13.7 MB)
Using cached python_dateutil-2.9.0.post0-py2.py3-none-any.whl (229 kB)
Using cached pytz-2024.1-py2.py3-none-any.whl (505 kB)
Using cached tzdata-2024.1-py2.py3-none-any.whl (345 kB)
Using cached python_dotenv-1.0.1-py3-none-any.whl (19 kB)
Using cached packaging-24.0-py3-none-any.whl (53 kB)
Using cached six-1.16.0-py2.py3-none-any.whl (11 kB)
Installing collected packages: pytz, ujson, tzdata, six, setuptools, python-dotenv, protobuf, packaging, numpy, milvus-lite, grpcio, python-dateutil, marshmallow, pandas, environs, pymilvus
Successfully installed environs-9.5.0 grpcio-1.63.0 marshmallow-3.21.2 milvus-lite-2.4.6 numpy-1.26.4 packaging-24.0 pandas-2.2.2 protobuf-5.27.0 pymilvus-2.4.3 python-dateutil-2.9.0.post0 python-dotenv-1.0.1 pytz-2024.1 setuptools-70.0.0 six-1.16.0 tzdata-2024.1 ujson-5.10.0
(milvusvenv) timothyspann@MacBook-Pro code % python3 testlite.py
data: ["[{'id': 0, 'distance': 0.9999999403953552, 'entity': {'text': 'Artificial intelligence was founded as an academic discipline in 1956.', 'subject': 'history'}}, {'id': 2, 'distance': -0.046234432607889175, 'entity': {'text': 'Born in Maida Vale, London, Turing was raised in southern England.', 'subject': 'history'}}]"] , extra_info: {'cost': 0}
data: ["{'id': 0, 'text': 'Artificial intelligence was founded as an academic discipline in 1956.', 'subject': 'history'}", "{'id': 1, 'text': 'Alan Turing was the first person to conduct substantial research in AI.', 'subject': 'history'}", "{'id': 2, 'text': 'Born in Maida Vale, London, Turing was raised in southern England.', 'subject': 'history'}"] , extra_info: {'cost': 0}
[0, 1, 2]
(milvusvenv) timothyspann@MacBook-Pro code %
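
By the way, testlite.py isn't shown here; it follows the standard Milvus quickstart pattern. A minimal sketch along those lines (the exact script may differ) assumes the optional embedding model extra, installed with pip3 install "pymilvus[model]", which provides a default embedding function:

from pymilvus import MilvusClient, model

# Milvus Lite: the "database" is just a local file
client = MilvusClient("milvus_demo.db")
client.create_collection(collection_name="demo_collection", dimension=768)

docs = [
    "Artificial intelligence was founded as an academic discipline in 1956.",
    "Alan Turing was the first person to conduct substantial research in AI.",
    "Born in Maida Vale, London, Turing was raised in southern England.",
]

# Default embedding function (768 dimensions) from the pymilvus[model] extra
embedding_fn = model.DefaultEmbeddingFunction()
vectors = embedding_fn.encode_documents(docs)
data = [
    {"id": i, "vector": vectors[i], "text": docs[i], "subject": "history"}
    for i in range(len(docs))
]
client.insert(collection_name="demo_collection", data=data)

# Vector search, a metadata-filtered query, then delete by primary key
print(client.search(
    collection_name="demo_collection",
    data=embedding_fn.encode_queries(["Who started AI research?"]),
    limit=2,
    output_fields=["text", "subject"],
))
print(client.query(
    collection_name="demo_collection",
    filter="subject == 'history'",
    output_fields=["text", "subject"],
))
print(client.delete(collection_name="demo_collection", ids=[0, 1, 2]))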

What is pretty awesome is that Milvus Lite (think lean and clean like MiNiFi, not Lite as in weak beer) is a lightweight vector database that runs within your Python application. You just install the Python SDK for Milvus and the local vector database comes included. You can now run on the edge or develop applications against the same API as enterprise Milvus deployments in clusters that scale to billions of vectors. You just need to change where you point your Milvus connection.

From:

client = MilvusClient("db/milvus_demo.db")

To:

client = MilvusClient(uri="http://server:19530")

Or:

client = MilvusClient(uri="https://server12345.serverless.gcp-us-west1.cloud.zilliz.com", token="tokenX")

What's cool is that nothing else changes, and you can access and use all the core components for vector indexing and query parsing. It runs fast as a Python process. This library has already been integrated with the world's leading AI dev stacks, LlamaIndex and LangChain. There is also integration with Haystack and Hugging Face, along with many others. A rough LangChain example is sketched below.
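
As an illustration of that integration, here is a minimal retrieval sketch using the langchain-milvus package (pip3 install langchain-milvus) against a local Milvus Lite file; this is also the retrieval half of a RAG pipeline. The FakeEmbeddings class is a stand-in so the sketch runs without API keys; in a real app you would swap in an OpenAI, Hugging Face, or other embedding model. Treat the exact package, class, and parameter names as assumptions to check against the current docs.

from langchain_community.embeddings import FakeEmbeddings
from langchain_milvus import Milvus

# Stand-in embeddings so the sketch runs offline (swap for a real model in practice)
embeddings = FakeEmbeddings(size=384)

# Point the LangChain vector store at a local Milvus Lite file
vector_store = Milvus(
    embedding_function=embeddings,
    collection_name="demo_rag",
    connection_args={"uri": "./milvus_demo.db"},
    auto_id=True,
)

# Index a few documents, then retrieve the most relevant one for a question
vector_store.add_texts([
    "Artificial intelligence was founded as an academic discipline in 1956.",
    "Alan Turing was the first person to conduct substantial research in AI.",
])
docs = vector_store.similarity_search("Who is Alan Turing?", k=1)
print(docs[0].page_content)

Going to production is the same swap shown above: change connection_args to point at a Milvus cluster or Zilliz Cloud instead of a local file.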

You can now set up a Retrieval-Augmented Generation (RAG) pipeline without having to stand up a server or sign up for a cloud cluster before you start. When it's time to go to production, you can deploy to a cluster or Zilliz Cloud. Easy, peasy, cluster squeezy…

There’s a cool notebook you can try right now:

If you don't have Jupyter installed yet, install it inside that virtual environment first.
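
For example, inside the same venv, something like this works (JupyterLab shown here; the classic Notebook is fine too):

python3 -m pip install jupyterlab
jupyter lab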

I will be going through a number of cool examples and providing guidance at my upcoming meetups in Princeton and New York City.

If you are not in the New York or Princeton area, have no worries: we will stream and record to YouTube. There are also other meetups around the world, including San Francisco, South Bay, Berlin, and Seattle.

If you saw my recent newsletter, you know I joined Zilliz to work on the open source AI database, Milvus.

I am working on a name for my new pattern of applications. It could be FLaNK-AIM, or AIM (AI + Milvus), or Tim-Tam (Tim's Towhee AI/Attu Milvus). Tim can also stand for TIMM (PyTorch Image Models).

I added a survey if you want to help out; thanks, I appreciate it. You could also leave a comment if you like.

RESOURCES

I will be adding all of my favorite vector database, AI database, LLM, generative AI, Milvus, and related resources to my new website: