

Encoder Embeddings for Object2Vec

GPU optimization: Encoder Embeddings

An embedding is a mapping from discrete objects, such as words, to vectors of real numbers.

Due to GPU memory scarcity, you can specify the INFERENCE_PREFERRED_MODE environment variable to optimize whether the data formats for Object2Vec inference or the encoder embedding inference network is loaded into GPU memory. If the majority of your inference is for encoder embeddings, specify INFERENCE_PREFERRED_MODE=embedding. The following is a batch transform example that is optimized for encoder embedding inference and uses four ml.p3.2xlarge instances:

transformer = o2v.transformer(instance_count=4,
                              instance_type="ml.p3.2xlarge",
                              max_concurrent_transforms=2,
                              max_payload=1,  # 1MB
                              strategy='MultiRecord',
                              env={'INFERENCE_PREFERRED_MODE': 'embedding'},  # only useful with GPU
                              output_path=output_s3_path)
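
After the transformer is created, the job can be started against the prepared input. The following is a minimal sketch, assuming input_s3_path is an S3 prefix that holds JSON Lines records in the input format described below:

# input_s3_path is a hypothetical S3 prefix containing application/jsonlines input.
transformer.transform(data=input_s3_path,
                      content_type='application/jsonlines',
                      split_type='Line')
transformer.wait()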

Input: Encoder Embeddings

Content-type: application/json; infer_max_seqlens=<FWD-LENGTH>,<BCK-LENGTH>

where <FWD-LENGTH> and <BCK-LENGTH> are integers in the range [1,5000] that define the maximum sequence lengths for the forward and backward encoders.

{ "instances" : [ {"in0": [6, 17, 606, 19, 53, 67, 52, 12, 5, 10, 15, 10178, 7, 33, 652, 80, 15, 69, 821, 4]}, {"in0": [22, 1016, 32, 13, 25, 11, 5, 64, 573, 45, 5, 80, 15, 67, 21, 7, 9, 107, 4]}, {"in0": [774, 14, 21, 206]} ] }

Content-type: application/jsonlines; infer_max_seqlens=<FWD-LENGTH>,<BCK-LENGTH>

where <FWD-LENGTH> and <BCK-LENGTH> are integers in the range [1,5000] that define the maximum sequence lengths for the forward and backward encoders.

{"in0": [6, 17, 606, 19, 53, 67, 52, 12, 5, 10, 15, 10178, 7, 33, 652, 80, 15, 69, 821, 4]} {"in0": [22, 1016, 32, 13, 25, 11, 5, 64, 573, 45, 5, 80, 15, 67, 21, 7, 9, 107, 4]} {"in0": [774, 14, 21, 206]}

In both of these formats, you specify only one input type: "in0" or "in1". The inference service then invokes the corresponding encoder and outputs the embeddings for each instance.

Output: Encoder Embeddings

Content-type: application/json

{ "predictions": [ {"embeddings":[0.057368703186511,0.030703511089086,0.099890425801277,0.063688032329082,0.026327300816774,0.003637571120634,0.021305780857801,0.004316598642617,0.0,0.003397724591195,0.0,0.000378780066967,0.0,0.0,0.0,0.007419463712722]}, {"embeddings":[0.150190666317939,0.05145975202322,0.098204270005226,0.064249359071254,0.056249320507049,0.01513972133398,0.047553978860378,0.0,0.0,0.011533712036907,0.011472506448626,0.010696629062294,0.0,0.0,0.0,0.008508535102009]} ] }

Content-type: application/jsonlines

{"embeddings":[0.057368703186511,0.030703511089086,0.099890425801277,0.063688032329082,0.026327300816774,0.003637571120634,0.021305780857801,0.004316598642617,0.0,0.003397724591195,0.0,0.000378780066967,0.0,0.0,0.0,0.007419463712722]} {"embeddings":[0.150190666317939,0.05145975202322,0.098204270005226,0.064249359071254,0.056249320507049,0.01513972133398,0.047553978860378,0.0,0.0,0.011533712036907,0.011472506448626,0.010696629062294,0.0,0.0,0.0,0.008508535102009]}

The vector length of the embeddings output by the inference service equals the value of one of the hyperparameters that you specified at training time: enc0_token_embedding_dim, enc1_token_embedding_dim, or enc_dim.
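
As a quick sanity check, the length of each returned vector can be compared against the dimension chosen at training time. A minimal sketch, assuming enc_dim was set to 16 (the dimension of the example outputs above) and embeddings holds the parsed vectors:

enc_dim = 16   # assumed training-time value
assert all(len(vec) == enc_dim for vec in embeddings)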