Describe the bug
I am trying to deploy a pretrained Keras model on Amazon SageMaker.

To reproduce

  1. I converted the Keras model into the TensorFlow SavedModel format that SageMaker supports (a sketch of this export is shown after this list).
    [Link: https://aws.amazon.com/blogs/machine-learning/deploy-trained-keras-or-tensorflow-models-using-amazon-sagemaker/]
  2. I created a code folder and added an inference.py and a requirements.txt file.
  3. I wrote the inference script as described in the link below, with some changes for my model (shown further down).
    Link: https://github.com/aws/sagemaker-tensorflow-serving-container#prepost-processing
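
For reference, a minimal sketch of the export in step 1, assuming TensorFlow 1.x with tf.keras and a trained model saved as model.h5; the file name, the input key 'input_image', and the export/Servo/1 directory follow the linked blog post and are placeholders. The inference.py from step 3 follows after this sketch.

```python
# export_savedmodel.py -- sketch only: converts a Keras .h5 model to a
# TensorFlow SavedModel under the version directory TensorFlow Serving expects.
import tensorflow as tf

model = tf.keras.models.load_model('model.h5')  # placeholder path to the trained model

# 'export/Servo/1' = export base directory + model version 1, as in the blog post.
tf.saved_model.simple_save(
    tf.keras.backend.get_session(),
    'export/Servo/1',
    inputs={'input_image': model.input},
    outputs={t.name: t for t in model.outputs})
```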

```python
# inference.py

import tensorflow as tf
import numpy as np
import requests
import base64
from PIL import Image
import io
from collections import namedtuple
import cv2
import json
import os

JPEG_CONTENT_TYPE = 'application/x-image'
JSON_CONTENT_TYPE = 'application/json'


def input_handler(data, context):
    if context.request_content_type == JPEG_CONTENT_TYPE:
        # Read the raw request body and decode it as an image.
        payload = data.read()
        image = Image.open(io.BytesIO(payload))
        # Preprocess: BGR->RGB, resize to the model's 300x300 input, scale to [0, 1].
        result_image = cv2.cvtColor(np.array(image), cv2.COLOR_BGR2RGB)
        img_cpy = cv2.resize(result_image, (300, 300))
        img_cpy = img_cpy / 255.
        img_cpy = np.expand_dims(img_cpy, 0)
        # Base64-encode the preprocessed array and wrap it in the TFS request format.
        encoded_image = base64.b64encode(img_cpy).decode('utf-8')
        instance = [{"b64": encoded_image}]
        return json.dumps({"instances": instance})
    else:
        _return_error(415, 'Unsupported content type "{}"'.format(
            context.request_content_type or 'Unknown'))


def output_handler(data, context):
    # Parse the TensorFlow Serving response and return the top class as JSON.
    response = json.loads(data.content.decode('utf-8'))
    pred = response.get('predictions')
    prediction = str(np.argmax(pred))
    return json.dumps({'classification_id': prediction}), JSON_CONTENT_TYPE


def _return_error(code, message):
    raise ValueError('Error: {}, {}'.format(str(code), message))
```
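
A rough sketch of how the archive in step 4 below could be built, assuming the SavedModel was exported to export/Servo/1 as in the earlier sketch and that inference.py / requirements.txt sit in a local code/ directory; the layout (version directory plus code/ at the archive root) is my reading of the linked serving-container README.

```python
# package_model.py -- sketch only: builds model.tar.gz with the layout I believe the
# TensorFlow Serving container expects:
#   1/saved_model.pb, 1/variables/..., code/inference.py, code/requirements.txt
import tarfile

with tarfile.open('model.tar.gz', 'w:gz') as tar:
    tar.add('export/Servo/1', arcname='1')  # model version directory
    tar.add('code', arcname='code')         # inference script + requirements
```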
  4. I zipped (tarred) the model folder into tar.gz format and uploaded it to an S3 bucket.
  5. Then I opened a SageMaker notebook instance to deploy; I uploaded the code folder to the notebook instance as well and started the deployment.
    It shows this error:

```
2020-06-03T15:21:54.839+05:30 | /usr/local/lib/python3.5/dist-packages/tensorflow/python/framework/dtypes.py:518: FutureWarning: Passing (type, 1) or '1type' as a synonym of type is deprecated; in a future version of numpy, it will be understood as (type, (1,)) / '(1,)type'.
  _np_qint16 = np.dtype([("qint16", np.int16, 1)])

2020-06-03T15:21:54.840+05:30 | Traceback (most recent call last):
  File "/usr/local/lib/python3.5/dist-packages/gunicorn/arbiter.py", line 583, in spawn_worker
    worker.init_process()
  File "/usr/local/lib/python3.5/dist-packages/gunicorn/workers/ggevent.py", line 203, in init_process
    super(GeventWorker, self).init_process()
  File "/usr/local/lib/python3.5/dist-packages/gunicorn/workers/base.py", line 129, in init_process
    self.load_wsgi()
  File "/usr/local/lib/python3.5/dist-packages/gunicorn/workers/base.py", line 138, in load_wsgi
    self.wsgi = self.app.wsgi()
  File "/usr/local/lib/python3.5/dist-packages/gunicorn/app/base.py", line 67, in wsgi
    self.callable = self.load()
  File "/usr/local/lib/python3.5/dist-packages/gunicorn/app/wsgiapp.py", line 52, in load
    return self.load_wsgiapp()
  File "/usr/local/lib/python3.5/dist-packages/gunicorn/app/wsgiapp.py", line 41, in load_wsgiapp
    return util.import_app(self.app_uri)
  File "/usr/local/lib/python3.5/dist-packages/gunicorn/util.py", line 350, in import_app
    __import__(module)
  File "/sagemaker/python_service.py", line 239, in <module>
    resources.add_routes(app)
  File "/sagemaker/python_service.py", line 227, in add_routes
    invocation_resource = InvocationResource()
  File "/sagemaker/python_service.py", line 48, in __init__
    self._handler, self._input_handler, self._output_handler = self._import_handlers()
  File "/sagemaker/python_service.py", line 68, in _import_handlers
    spec.loader.exec_module(inference)
  File "/opt/ml/model/code/inference.py", line 8, in <module>
    import cv2
  File "/usr/local/lib/python3.5/dist-packages/cv2/__init__.py", line 5, in <module>
    from .cv2 import *
```
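
For context, the deployment in step 5 was launched roughly like this (a sketch, assuming SageMaker Python SDK 1.x; the S3 path and instance type are placeholders):

```python
# deploy.py -- sketch only: deploys model.tar.gz with the TensorFlow Serving container.
import sagemaker
from sagemaker.tensorflow.serving import Model

role = sagemaker.get_execution_role()

model = Model(model_data='s3://<bucket>/model.tar.gz',  # placeholder S3 path
              role=role,
              framework_version='1.14')

predictor = model.deploy(initial_instance_count=1,
                         instance_type='ml.p2.xlarge')  # placeholder instance type
```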

Expected behavior

I need to check the inference: when I pass an image using the InvokeEndpoint API, it should return the JSON response.
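
A minimal sketch of that check, assuming boto3 and a deployed endpoint; the endpoint name and image path are placeholders:

```python
# invoke.py -- sketch only: sends a raw image to the endpoint and prints the JSON reply.
import boto3

runtime = boto3.client('sagemaker-runtime')

with open('test.jpg', 'rb') as f:  # placeholder image path
    image_bytes = f.read()

response = runtime.invoke_endpoint(
    EndpointName='my-endpoint',         # placeholder endpoint name
    ContentType='application/x-image',  # matches JPEG_CONTENT_TYPE in inference.py
    Body=image_bytes)

print(response['Body'].read().decode('utf-8'))  # expected: {"classification_id": "..."}
```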

System information
A description of your system. Please provide:

  • SageMaker Python SDK version:
  • Framework name (eg. PyTorch) or algorithm (eg. KMeans): TensorFlow-backed Keras classification
  • Framework version: 1.14 or 1.12
  • Python version: 3
  • CPU or GPU: GPU
  • Custom Docker image (Y/N): N
