AWS Code Sample Catalog

This example demonstrates how to detect unsafe content in an image loaded from an S3 bucket.

# Copyright 2010-2019 Amazon.com, Inc. or its affiliates. All Rights Reserved.
#
# This file is licensed under the Apache License, Version 2.0 (the "License").
# You may not use this file except in compliance with the License. A copy of the
# License is located at
#
#
# This file is distributed on an "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS
# OF ANY KIND, either express or implied. See the License for the specific
# language governing permissions and limitations under the License.

import boto3

if __name__ == "__main__":
    # Change the values of photo and bucket to your values.
    photo = 'moderate.png'
    bucket = 'bucket'

    client = boto3.client('rekognition')
    response = client.detect_moderation_labels(
        Image={'S3Object': {'Bucket': bucket, 'Name': photo}})

    print('Detected labels for ' + photo)
    for label in response['ModerationLabels']:
        print(label['Name'] + ' : ' + str(label['Confidence']))
        print(label['ParentName'])
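In practice you usually want to act only on labels above a confidence threshold rather than print every result. The sketch below shows one way to filter the `ModerationLabels` list from a `detect_moderation_labels` response; the `filter_labels` helper and the sample response values are made up for illustration, though the response shape matches the Rekognition API.

```python
def filter_labels(response, min_confidence=60.0):
    """Return (name, parent, confidence) tuples at or above the threshold.

    `response` is expected to have the shape returned by
    detect_moderation_labels; only 'ModerationLabels' is read here.
    """
    return [
        (label['Name'], label['ParentName'], label['Confidence'])
        for label in response['ModerationLabels']
        if label['Confidence'] >= min_confidence
    ]


# Hypothetical response, shaped like Rekognition's output, for illustration only.
sample_response = {
    'ModerationLabels': [
        {'Name': 'Weapon Violence', 'ParentName': 'Violence', 'Confidence': 86.4},
        {'Name': 'Smoking', 'ParentName': 'Tobacco', 'Confidence': 42.1},
    ]
}

for name, parent, confidence in filter_labels(sample_response):
    print(name, parent, confidence)
```

With the default threshold of 60, only the first label in the sample survives the filter; you could also pass `MinConfidence` directly in the `detect_moderation_labels` call to have the service do the filtering.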

Sample Details

Service: rekognition

Last tested: 2019-01-03

Author: reesch (AWS)

Type: full-example
