Missing 2 required positional arguments: 'milvus' and 'triton'

The Chain-Server is failing with the error below. I converted the Docker Compose file to OpenShift manifests with the kompose utility and deployed them on OpenShift 4.17.
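
For reference, the conversion and deployment went roughly like this (a sketch; the Compose file name and the output file name are assumptions):

# Convert the Docker Compose service definitions to OpenShift objects
# (--provider openshift emits DeploymentConfig and ImageStream instead of Deployment)
kompose convert -f docker-compose.yaml --provider openshift -o chain-server.yaml

# Apply the generated manifests to the current project
oc apply -f chain-server.yaml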

Here is the YAML file:

---
apiVersion: apps.openshift.io/v1
kind: DeploymentConfig
metadata:
  labels:
    io.kompose.service: chain-server
  name: chain-server
spec:
  replicas: 1
  selector:
    io.kompose.service: chain-server
  strategy:
    type: Recreate
  template:
    metadata:
      labels:
        io.kompose.service: chain-server
    spec:
      serviceAccountName: default
      initContainers:
      - command:
        - sh
        - -c
        - mkdir /tmp-data; touch /tmp-data/test.txt; mkdir /.cache ; touch /.cache/test.txt
        image: busybox:1.28
        imagePullPolicy: IfNotPresent
        name: init-service
        securityContext:
          privileged: true
          runAsGroup: 0
          runAsUser: 0
      containers:
        - args:
            - --port
            - "8081"
            - --host
            - 0.0.0.0
          env:
            - name: APP_EMBEDDINGS_MODELENGINE
              value: nvidia-ai-endpoints
            - name: APP_EMBEDDINGS_MODELNAME
              value: nvidia/nv-embedqa-e5-v5
            - name: APP_EMBEDDINGS_SERVERURL
              value: nemollm-embedding:8000
            - name: APP_LLM_MODELENGINE
              value: nvidia-ai-endpoints
            - name: APP_LLM_MODELNAME
              value: '"meta/llama3-70b-instruct"'
            - name: APP_LLM_SERVERURL
              value: nemollm-inference:8000
            - name: APP_RETRIEVER_SCORETHRESHOLD
              value: "0.25"
            - name: APP_RETRIEVER_TOPK
              value: "4"
            - name: APP_TEXTSPLITTER_CHUNKOVERLAP
              value: "200"
            - name: APP_TEXTSPLITTER_CHUNKSIZE
              value: "506"
            - name: APP_TEXTSPLITTER_MODELNAME
              value: Snowflake/snowflake-arctic-embed-l
            - name: APP_VECTORSTORE_NAME
              value: milvus
            - name: APP_VECTORSTORE_URL
              value: http://milvus:19530
            - name: COLLECTION_NAME
              value: nvidia_api_catalog
            - name: ENABLE_TRACING
              value: "false"
            - name: EXAMPLE_PATH
              value: basic_rag/langchain
            - name: LOGLEVEL
              value: INFO
            - name: NVIDIA_API_KEY
              value: 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
            - name: OTEL_EXPORTER_OTLP_ENDPOINT
              value: http://otel-collector:4317
            - name: OTEL_EXPORTER_OTLP_PROTOCOL
              value: grpc
          image: ' '
          name: chain-server
          ports:
            - containerPort: 8081
              protocol: TCP
          volumeMounts:
            - mountPath: /prompt.yaml
              name: chain-server-cm0
              subPath: prompt.yaml
      restartPolicy: Always
      volumes:
        - configMap:
            items:
              - key: prompt.yaml
                path: prompt.yaml
            name: chain-server-cm0
          name: chain-server-cm0
  test: false
  triggers:
    - type: ConfigChange
    - imageChangeParams:
        automatic: true
        containerNames:
          - chain-server
        from:
          kind: ImageStreamTag
          name: chain-server:latest
      type: ImageChange
---
apiVersion: image.openshift.io/v1
kind: ImageStream
metadata:
  labels:
    io.kompose.service: chain-server
  name: chain-server
spec:
  lookupPolicy:
    local: false
  tags:
    - from:
        kind: DockerImage
        name: nvcr.io/nvidia/aiworkflows/chain-server:latest
      name: latest
      referencePolicy:
        type: ""
---
 
mann@NWWGGH-A8140B2D mmbasic-rag % oc logs pod/chain-server-7-lb5bk
Defaulted container "chain-server" out of: chain-server, init-service (init)
/usr/local/lib/python3.10/dist-packages/langchain/__init__.py:40: UserWarning: Importing BasePromptTemplate from langchain root module is no longer supported.
  warnings.warn(
/usr/local/lib/python3.10/dist-packages/langchain/__init__.py:40: UserWarning: Importing PromptTemplate from langchain root module is no longer supported.
  warnings.warn(
Downloading .gitattributes: 100%|██████████| 1.48k/1.48k [00:00<00:00, 13.6MB/s]
Downloading 1_Pooling/config.json: 100%|██████████| 201/201 [00:00<00:00, 2.14MB/s]
Downloading README.md: 100%|██████████| 67.8k/67.8k [00:00<00:00, 5.11MB/s]
Downloading config.json: 100%|██████████| 616/616 [00:00<00:00, 7.14MB/s]
Downloading handler.py: 100%|██████████| 1.12k/1.12k [00:00<00:00, 14.3MB/s]
Downloading model.safetensors: 100%|██████████| 1.34G/1.34G [00:05<00:00, 233MB/s]
Downloading pytorch_model.bin: 100%|██████████| 1.34G/1.34G [00:05<00:00, 225MB/s]
Downloading (…)nce_bert_config.json: 100%|██████████| 57.0/57.0 [00:00<00:00, 454kB/s]
Downloading (…)cial_tokens_map.json: 100%|██████████| 125/125 [00:00<00:00, 1.48MB/s]
Downloading tokenizer.json: 100%|██████████| 711k/711k [00:00<00:00, 13.8MB/s]
Downloading tokenizer_config.json: 100%|██████████| 314/314 [00:00<00:00, 3.73MB/s]
Downloading vocab.txt: 100%|██████████| 232k/232k [00:00<00:00, 32.5MB/s]
Downloading modules.json: 100%|██████████| 387/387 [00:00<00:00, 4.08MB/s]
Traceback (most recent call last):
  File "/usr/local/bin/uvicorn", line 8, in <module>
    sys.exit(main())
  File "/usr/local/lib/python3.10/dist-packages/click/core.py", line 1157, in __call__
    return self.main(*args, **kwargs)
  File "/usr/local/lib/python3.10/dist-packages/click/core.py", line 1078, in main
    rv = self.invoke(ctx)
  File "/usr/local/lib/python3.10/dist-packages/click/core.py", line 1434, in invoke
    return ctx.invoke(self.callback, **ctx.params)
  File "/usr/local/lib/python3.10/dist-packages/click/core.py", line 783, in invoke
    return __callback(*args, **kwargs)
  File "/usr/local/lib/python3.10/dist-packages/uvicorn/main.py", line 416, in main
    run(
  File "/usr/local/lib/python3.10/dist-packages/uvicorn/main.py", line 587, in run
    server.run()
  File "/usr/local/lib/python3.10/dist-packages/uvicorn/server.py", line 61, in run
    return asyncio.run(self.serve(sockets=sockets))
  File "/usr/lib/python3.10/asyncio/runners.py", line 44, in run
    return loop.run_until_complete(main)
  File "uvloop/loop.pyx", line 1517, in uvloop.loop.Loop.run_until_complete
  File "/usr/local/lib/python3.10/dist-packages/uvicorn/server.py", line 68, in serve
    config.load()
  File "/usr/local/lib/python3.10/dist-packages/uvicorn/config.py", line 467, in load
    self.loaded_app = import_from_string(self.app)
  File "/usr/local/lib/python3.10/dist-packages/uvicorn/importer.py", line 21, in import_from_string
    module = importlib.import_module(module_str)
  File "/usr/lib/python3.10/importlib/__init__.py", line 126, in import_module
    return _bootstrap._gcd_import(name[level:], package, level)
  File "<frozen importlib._bootstrap>", line 1050, in _gcd_import
  File "<frozen importlib._bootstrap>", line 1027, in _find_and_load
  File "<frozen importlib._bootstrap>", line 1006, in _find_and_load_unlocked
  File "<frozen importlib._bootstrap>", line 688, in _load_unlocked
  File "<frozen importlib._bootstrap_external>", line 883, in exec_module
  File "<frozen importlib._bootstrap>", line 241, in _call_with_frames_removed
  File "/opt/chain_server/server.py", line 19, in <module>
    chains.set_service_context()
  File "/opt/chain_server/chains.py", line 148, in set_service_context
    llm=get_llm(), embed_model=get_embedding_model()
  File "/opt/chain_server/chains.py", line 102, in get_llm
    settings = get_config()
  File "/opt/chain_server/chains.py", line 93, in get_config
    config = configuration.AppConfig.from_file(config_file)
  File "/opt/chain_server/configuration_wizard.py", line 293, in from_file
    config = cls.from_dict({})
  File "/opt/chain_server/configuration_wizard.py", line 241, in from_dict
    return fromdict(cls, data)  # type: ignore[no-any-return] # dataclass-wizard doesn't provide stubs
  File "/usr/local/lib/python3.10/dist-packages/dataclass_wizard/loaders.py", line 536, in fromdict
    return load(d)
  File "/usr/local/lib/python3.10/dist-packages/dataclass_wizard/loaders.py", line 662, in cls_fromdict
    raise MissingFields(
dataclass_wizard.errors.MissingFields: Failure calling constructor method of class `AppConfig`. Missing values for required dataclass fields.
  have fields: []
  missing fields: ['milvus', 'triton']
  input JSON object: {}
  error: AppConfig.__init__() missing 2 required positional arguments: 'milvus' and 'triton'
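
From the traceback, configuration.AppConfig.from_file ends up calling from_dict({}) with an empty dict, so the required milvus and triton fields are never populated, which suggests the configuration file the application expects is missing or empty. To see where the application looks for its configuration, the code referenced in the traceback can be inspected from a throwaway debug pod (a debugging sketch, not a fix):

# Start a disposable copy of the chain-server pod with a shell
oc debug dc/chain-server

# Inside the debug shell: the application code from the traceback lives here
ls /opt/chain_server
grep -n config_file /opt/chain_server/chains.py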

Hi @mahantesh.meti – which workflow are you trying to deploy here?

Hi,

I’m trying to deploy the aiworkflows multimodal chatbot. The NeMo inference and embedding services are already running; we still need to deploy the LangChain chain-server and the RAG Playground applications.
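
I will also double-check that the services the chain-server points at resolve inside the same project (a quick check; the service names are taken from the env vars in the manifest above):

# Service names from APP_VECTORSTORE_URL, APP_EMBEDDINGS_SERVERURL and APP_LLM_SERVERURL
oc get svc milvus nemollm-embedding nemollm-inference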

Thanks
-Mahantesh