Can't run inference on AWS Lambda #356

Open
1 of 2 tasks
DominiquePaul opened this issue Apr 13, 2024 · 1 comment
Labels
bug Something isn't working

Comments


DominiquePaul commented Apr 13, 2024

Search before asking

  • I have searched the Inference issues and found no similar bug report.

Bug

I am running this code on AWS Lambda:

import os
from inference_sdk import InferenceHTTPClient

def handler(event, context):
    client = InferenceHTTPClient(api_url="https://detect.roboflow.com",
                                 api_key=os.environ["ROBOFLOW_API_KEY"])
    img_path = "./pizza.jpg"
    return client.infer(img_path, model_id="pizza-identifier/3")

It runs as part of a Docker container built from the following Dockerfile:

FROM public.ecr.aws/lambda/python:3.11

RUN yum install -y mesa-libGL

COPY requirements.txt ${LAMBDA_TASK_ROOT}

RUN pip install -r requirements.txt

COPY pizza.jpg ${LAMBDA_TASK_ROOT}

COPY lambda_function.py ${LAMBDA_TASK_ROOT}

CMD [ "lambda_function.handler" ]

My requirements.txt contains nothing but inference==0.9.17

When the code runs I get the following error. I have been trying to fix this and have tried several workarounds, but to no avail. I understand that the error is somehow related to multiprocessing. I found this post, from which I gather that multiprocessing isn't possible on AWS Lambda; however, my script does not control or trigger any multiprocessing itself.

This is the full error:

{
  "errorMessage": "[Errno 38] Function not implemented",
  "errorType": "OSError",
  "requestId": "703be804-fd86-4b44-88f9-ac54c87717be",
  "stackTrace": [
    "  File \"/var/task/lambda_function.py\", line 10, in handler\n    return client.infer(img_path, model_id=\"pizza-identifier/3\")\n",
    "  File \"/var/lang/lib/python3.11/site-packages/inference_sdk/http/client.py\", line 82, in decorate\n    return function(*args, **kwargs)\n",
    "  File \"/var/lang/lib/python3.11/site-packages/inference_sdk/http/client.py\", line 237, in infer\n    return self.infer_from_api_v0(\n",
    "  File \"/var/lang/lib/python3.11/site-packages/inference_sdk/http/client.py\", line 299, in infer_from_api_v0\n    responses = execute_requests_packages(\n",
    "  File \"/var/lang/lib/python3.11/site-packages/inference_sdk/http/utils/executors.py\", line 42, in execute_requests_packages\n    responses = make_parallel_requests(\n",
    "  File \"/var/lang/lib/python3.11/site-packages/inference_sdk/http/utils/executors.py\", line 58, in make_parallel_requests\n    with ThreadPool(processes=workers) as pool:\n",
    "  File \"/var/lang/lib/python3.11/multiprocessing/pool.py\", line 930, in __init__\n    Pool.__init__(self, processes, initializer, initargs)\n",
    "  File \"/var/lang/lib/python3.11/multiprocessing/pool.py\", line 196, in __init__\n    self._change_notifier = self._ctx.SimpleQueue()\n",
    "  File \"/var/lang/lib/python3.11/multiprocessing/context.py\", line 113, in SimpleQueue\n    return SimpleQueue(ctx=self.get_context())\n",
    "  File \"/var/lang/lib/python3.11/multiprocessing/queues.py\", line 341, in __init__\n    self._rlock = ctx.Lock()\n",
    "  File \"/var/lang/lib/python3.11/multiprocessing/context.py\", line 68, in Lock\n    return Lock(ctx=self.get_context())\n",
    "  File \"/var/lang/lib/python3.11/multiprocessing/synchronize.py\", line 169, in __init__\n    SemLock.__init__(self, SEMAPHORE, 1, 1, ctx=ctx)\n",
    "  File \"/var/lang/lib/python3.11/multiprocessing/synchronize.py\", line 57, in __init__\n    sl = self._semlock = _multiprocessing.SemLock(\n"
  ]
}
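
For what it's worth, the failure seems reproducible without the inference SDK at all. My reading of the trace (not verified against the SDK internals) is that multiprocessing.pool.ThreadPool allocates a multiprocessing lock on construction, which requires POSIX semaphores that the Lambda runtime does not provide:

from multiprocessing.pool import ThreadPool

def handler(event, context):
    # On AWS Lambda this raises OSError: [Errno 38] Function not implemented,
    # because Pool.__init__ creates a SemLock-backed lock even for threads.
    with ThreadPool(processes=4) as pool:
        return pool.map(str, range(4))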

Environment

No response

Minimal Reproducible Example

No response

Additional

I am incredibly frustrated since I've been working on this for 9 hours now and would appreciate any hints!

Are you willing to submit a PR?

  • Yes, I'd like to help by submitting a PR!
DominiquePaul added the bug (Something isn't working) label Apr 13, 2024
PawelPeczek-Roboflow (Collaborator) commented

Hi,
Thanks for pointing this out. Your code does not explicitly do multiprocessing, but the client does (strictly speaking it uses threading: the trace shows multiprocessing.pool.ThreadPool, which still creates multiprocessing locks).
We were not aware of the problem, as we run infer_async(...) within our own Lambdas.
We will take a look, but resolving this will probably take some time.

As a workaround, you may take a look here: https://stackoverflow.com/questions/60455830/can-you-have-an-async-handler-in-lambda-python-3-6
It should be possible to run an async coroutine (await client.infer_async(...)) within the Lambda handler. That should work.
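
For example (a minimal sketch, untested, assuming infer_async mirrors the synchronous infer signature; asyncio.run from the standard library drives the coroutine inside the ordinary synchronous handler):

import asyncio
import os

from inference_sdk import InferenceHTTPClient

def handler(event, context):
    client = InferenceHTTPClient(api_url="https://detect.roboflow.com",
                                 api_key=os.environ["ROBOFLOW_API_KEY"])
    # The async code path avoids the synchronous client's ThreadPool,
    # so no multiprocessing lock is ever created inside Lambda.
    return asyncio.run(
        client.infer_async("./pizza.jpg", model_id="pizza-identifier/3")
    )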

Sorry for the inconvenience, and thanks for raising this bug.
