How to use Boto3 to paginate through all crawlers present in AWS Glue
In this article, we will see how to paginate through all crawlers present in AWS Glue.
Example
Paginate through all crawlers from the AWS Glue Data Catalog that are created in your account.
Problem Statement: Use the boto3 library in Python to paginate through all crawlers from the AWS Glue Data Catalog that are created in your account.
Approach/Algorithm to solve this problem
Step 1: Import boto3 and botocore exceptions to handle exceptions.
Step 2: max_items, page_size and starting_token are the parameters for this function.
max_items denotes the total number of records to return. If the number of available records is greater than max_items, a NextToken is provided in the response to resume pagination.
page_size denotes the number of records to return in each page.
starting_token helps to paginate; it takes the NextToken from a previous response.
Step 3: Create an AWS session using the boto3 library. Make sure region_name is mentioned in the default profile. If it is not mentioned, explicitly pass region_name while creating the session, as shown in the sketch after this list.
Step 4: Create an AWS client for Glue.
Step 5: Create a paginator object for all crawlers using get_crawlers.
Step 6: Call the paginate function and pass max_items, page_size and starting_token as the PaginationConfig parameter.
Step 7: It returns the number of records based on max_items and page_size.
Step 8: Handle the generic exception if something goes wrong while paginating.
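For Step 3, if the default profile does not specify a region, the session can be created with an explicit region_name. A minimal sketch; the region used here is only a placeholder:

import boto3

# 'us-east-1' is an assumed placeholder; substitute your own region
session = boto3.session.Session(region_name='us-east-1')
glue_client = session.client('glue')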
Example Code
Use the following code to paginate through all crawlers created in your account −
import boto3
from botocore.exceptions import ClientError

def paginate_through_crawlers(max_items: int = None, page_size: int = None, starting_token: str = None):
    session = boto3.session.Session()
    glue_client = session.client('glue')
    try:
        # Create a paginator for the get_crawlers operation
        paginator = glue_client.get_paginator('get_crawlers')
        response = paginator.paginate(PaginationConfig={
            'MaxItems': max_items,
            'PageSize': page_size,
            'StartingToken': starting_token}
        )
        return response
    except ClientError as e:
        raise Exception("boto3 client error in paginate_through_crawlers: " + e.__str__())
    except Exception as e:
        raise Exception("Unexpected error in paginate_through_crawlers: " + e.__str__())

# 1st Run: fetch up to 3 records, 5 per page
a = paginate_through_crawlers(3, 5)
print(*a)

# 2nd Run: resume pagination using the NextToken from the first run
for items in a:
    next_token = items['NextToken']
b = paginate_through_crawlers(3, 5, next_token)
print(*b)
Output
#1st Run
{'Crawlers': [{'Name': 'DailyTest_v1.01', 'Role': 'ds-dev', 'Targets': {'S3Targets': [{'Path': 's3://**************/UIT_Raw/', 'Exclusions': []}], 'JdbcTargets': [], 'DynamoDBTargets': [], 'CatalogTargets': []}, 'DatabaseName': 'default', 'Classifiers': [], 'SchemaChangePolicy': {'UpdateBehavior': 'UPDATE_IN_DATABASE', 'DeleteBehavior': 'DEPRECATE_IN_DATABASE'}, 'State': 'READY', 'TablePrefix': 'test_uit_', 'CrawlElapsedTime': 0, 'CreationTime': datetime.datetime(2020, 11, 23, 17, 50, 20, tzinfo=tzlocal()), 'LastUpdated': datetime.datetime(2020, 12, 11, 18, 22, 34, tzinfo=tzlocal()), 'LastCrawl': {'Status': 'SUCCEEDED', 'LogGroup': '/aws-glue/crawlers', 'LogStream': '01_DailySalesAsOff_v1.01', 'MessagePrefix': '71fc0485-4755-42ca-a208-0654dd84d011', 'StartTime': datetime.datetime(2020, 12, 11, 18, 54, 46, tzinfo=tzlocal())}, 'Version': 10}, {'Name': 'Client_list', 'Role': 'ds-dev', 'Targets': {'S3Targets': [{'Path': 's3://************Client_list_01072021/', 'Exclusions': []}], 'JdbcTargets': [], 'DynamoDBTargets': [], 'CatalogTargets': []}, 'DatabaseName': 'ds_adhoc', 'Classifiers': [], 'SchemaChangePolicy': {'UpdateBehavior': 'UPDATE_IN_DATABASE', 'DeleteBehavior': 'DEPRECATE_IN_DATABASE'}, 'State': 'READY', 'CrawlElapsedTime': 0, 'CreationTime': datetime.datetime(2021, 1, 8, 3, 52, 27, tzinfo=tzlocal()), 'LastUpdated': datetime.datetime(2021, 1, 8, 3, 52, 27, tzinfo=tzlocal()), 'LastCrawl': {'Status': 'SUCCEEDED', 'LogGroup': '/aws-glue/crawlers', 'LogStream': 'Client_list_01072021', 'MessagePrefix': '41733a73-8946-4906-969c-f9581237b833', 'StartTime': datetime.datetime(2021, 1, 8, 3, 52, 45, tzinfo=tzlocal())}, 'Version': 1}, {'Name': 'Data Dimension', 'Role': 'qa-edl-glue-role', 'Targets': {'S3Targets': [{'Path': 's3://**************/data_dimension', 'Exclusions': []}], 'JdbcTargets': [], 'DynamoDBTargets': [], 'CatalogTargets': []}, 'DatabaseName': 'qa_edl_glue_database', 'Classifiers': [], 'SchemaChangePolicy': {'UpdateBehavior': 'UPDATE_IN_DATABASE', 'DeleteBehavior': 'DEPRECATE_IN_DATABASE'}, 'State': 'READY', 'CrawlElapsedTime': 0, 'CreationTime': datetime.datetime(2020, 8, 12, 0, 36, 21, tzinfo=tzlocal()), 'LastUpdated': datetime.datetime(2021, 3, 28, 13, 21, 19, tzinfo=tzlocal()), 'LastCrawl': {'Status': 'SUCCEEDED', 'LogGroup': '/aws-glue/crawlers', 'LogStream': 'Data Dimension', 'MessagePrefix': 'ee09c0ac-b778-467e-a941-c86c37edde47', 'StartTime': datetime.datetime(2021, 3, 28, 14, 1, 50, tzinfo=tzlocal())}, 'Version': 11}], 'NextToken': 'crawlr-wells', 'ResponseMetadata': {'RequestId': '8a6114ec-************d66', 'HTTPStatusCode': 200, 'HTTPHeaders': {'date': 'Fri, 02 Apr 2021 11:00:17 GMT', 'content-type': 'application/x-amz-json-1.1', 'content-length': '86627', 'connection': 'keep-alive', 'x-amzn-requestid': '8a6114ec-*****************66'}, 'RetryAttempts': 0}}

#2nd Run
{'Crawlers': [{'Name': 'crwlr-cw-etf', 'Role': 'dev-ds-glue-role', 'Targets': {'S3Targets': [{'Path': 's3://***********CW_2020Q3', 'Exclusions': []}], 'JdbcTargets': [], 'DynamoDBTargets': [], 'CatalogTargets': []}, 'DatabaseName': 'ivz-dev-ds-data-packs', 'Description': 'Data pack crawlers', 'Classifiers': [], 'SchemaChangePolicy': {'UpdateBehavior': 'UPDATE_IN_DATABASE', 'DeleteBehavior': 'DEPRECATE_IN_DATABASE'}, 'State': 'READY', 'CrawlElapsedTime': 0, 'CreationTime': datetime.datetime(2020, 10, 28, 17, 30, 41, tzinfo=tzlocal()), 'LastUpdated': datetime.datetime(2020, 11, 17, 12, 47, 21, tzinfo=tzlocal()), 'LastCrawl': {'Status': 'SUCCEEDED', 'LogGroup': '/aws-glue/crawlers', 'LogStream': 'crwlr-cw-etf', 'MessagePrefix': '49cb001f-3005-43ef-96f7-a1d45917c808', 'StartTime': datetime.datetime(2020, 11, 17, 17, 41, 16, tzinfo=tzlocal())}, 'Version': 5, 'Configuration': '{"Version":1.0,"Grouping":{"TableGroupingPolicy":"CombineCompatibleSchemas"}}'}, {'Name': 'crwlr-data-packs', 'Role': 'dev-ds-glue-role', 'Targets': {'S3Targets': [{'Path': 's3://*****************raw-parquet', 'Exclusions': []}], 'JdbcTargets': [], 'DynamoDBTargets': [], 'CatalogTargets': []}, 'DatabaseName': 'ivz-dev-ds-data-packs', 'Classifiers': [], 'SchemaChangePolicy': {'UpdateBehavior': 'UPDATE_IN_DATABASE', 'DeleteBehavior': 'DEPRECATE_IN_DATABASE'}, 'State': 'READY', 'CrawlElapsedTime': 0, 'CreationTime': datetime.datetime(2020, 4, 21, 20, 49, 6, tzinfo=tzlocal()), 'LastUpdated': datetime.datetime(2020, 4, 21, 20, 49, 6, tzinfo=tzlocal()), 'LastCrawl': {'Status': 'SUCCEEDED', 'LogGroup': '/aws-glue/crawlers', 'LogStream': 'crwlr-data-packs-mstr', 'MessagePrefix': '26aedfe8-f631-41e3-acd5-35877d71be1b', 'StartTime': datetime.datetime(2020, 4, 21, 20, 49, 10, tzinfo=tzlocal())}, 'Version': 1, 'Configuration': '{"Version":1.0,"Grouping":{"TableGroupingPolicy":"CombineCompatibleSchemas"}}'}, {'Name': 'crwlr-data-packs', 'Role': 'dev-ds-glue-role', 'Targets': {'S3Targets': [{'Path': 's3://******ubs-raw-parquet', 'Exclusions': []}], 'JdbcTargets': [], 'DynamoDBTargets': [], 'CatalogTargets': []}, 'DatabaseName': 'ivz-dev-ds-data-packs', 'Classifiers': [], 'SchemaChangePolicy': {'UpdateBehavior': 'UPDATE_IN_DATABASE', 'DeleteBehavior': 'DEPRECATE_IN_DATABASE'}, 'State': 'READY', 'CrawlElapsedTime': 0, 'CreationTime': datetime.datetime(2020, 4, 14, 2, 31, 25, tzinfo=tzlocal()), 'LastUpdated': datetime.datetime(2020, 5, 28, 21, 52, 4, tzinfo=tzlocal()), 'LastCrawl': {'Status': 'SUCCEEDED', 'LogGroup': '/aws-glue/crawlers', 'LogStream': 'ivz-dev-ds-crwlr-data-packs-ubs', 'MessagePrefix': '6c00dc7b-181e-4eb2-8d6d-d195f97b03ce', 'StartTime': datetime.datetime(2020, 6, 4, 16, 13, 18, tzinfo=tzlocal())}, 'Version': 5, 'Configuration': '{"Version":1.0,"Grouping":{"TableGroupingPolicy":"CombineCompatibleSchemas"}}'}], 'NextToken': 'discovery_rep', 'ResponseMetadata': {'RequestId': '43e0b162-***********', 'HTTPStatusCode': 200, 'HTTPHeaders': {'date': 'Fri, 02 Apr 2021 11:00:18 GMT', 'content-type': 'application/x-amz-json-1.1', 'content-length': '89110', 'connection': 'keep-alive', 'x-amzn-requestid': '43e0b162-********'}, 'RetryAttempts': 0}}
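The manual token handling above follows the step-by-step approach, but boto3 paginators can also walk every page on their own. A minimal sketch, assuming the same Glue client as in the example; printing each crawler name is only for illustration:

import boto3

session = boto3.session.Session()
glue_client = session.client('glue')
paginator = glue_client.get_paginator('get_crawlers')

# Without a PaginationConfig, boto3 follows NextToken internally
# until every page has been consumed
for page in paginator.paginate():
    for crawler in page['Crawlers']:
        print(crawler['Name'])

If one merged response is more convenient than individual pages, the page iterator also exposes build_full_result(), which combines the 'Crawlers' lists from all pages into a single result.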
