I'm attempting to deploy a locally trained TensorFlow model to a SageMaker endpoint, including custom inference code to transform the input data. The model is already trained and uploaded to an S3 bucket, so the following runs and correctly deploys the model to an endpoint:
from sagemaker.tensorflow.serving import Model  # SageMaker Python SDK v1 (TensorFlowModel in v2)

tensorflow_model = Model(
    model_data=saved_model,
    entry_point='src/development/document_matcher/inference.py',
    source_dir='./src/development/document_matcher',
    role=role,
    framework_version=tf_framework_version,
)
The inference script itself works; however, it also needs the model's vocabulary to transform the incoming data correctly. That vocabulary file is what I'm trying to upload via source_dir.
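For reference, the load inside inference.py looks roughly like this (simplified; the bare relative path is my best guess at what's failing, since the serving process's working directory isn't necessarily the directory inference.py lives in):

```python
import pickle

# Simplified sketch of the load in inference.py: a bare relative path
# resolves against the serving process's current working directory,
# not against the directory the script was unpacked into.
def load_vocab(path="total_vocab.pkl"):
    with open(path, "rb") as f:
        return pickle.load(f)
```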
My directory structure is as follows:
- src
  - development
    - document_matcher
      - inference.py
      - total_vocab.pkl
I need total_vocab.pkl to be accessible to my inference script at runtime, but inference.py is unable to find it. Am I misunderstanding how SageMaker works?