

Showing posts from August, 2020

Connecting to Salesforce using Python [aiosfstream]

Connect to the Salesforce Streaming API using Python to consume Salesforce objects.

Library used: aiosfstream
Ref link: https://aiosfstream.readthedocs.io/en/latest/quickstart.html#connecting

Quick start:

Authentication: To connect to the Salesforce Streaming API, all clients must authenticate themselves. The library supports several ways.

Username - Password authentication (using SalesforceStreamingClient):

    client = SalesforceStreamingClient(
        consumer_key="<consumer key>",
        consumer_secret="<consumer secret>",
        username="<username>",
        password="<password>"
    )
    # no separate Client(auth) needed; SalesforceStreamingClient is itself the client

Refresh token authentication:

    auth = RefreshTokenAuthenticator(
        consumer_key="<consumer key>",
        consumer_secret="<consumer secret>",
        refresh_token="<refresh_token>"
    )
    client = Client(auth)

Authentication on sand...
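Once authenticated, the client can subscribe to channels and iterate over incoming messages. Below is a minimal, runnable sketch along the lines of the quickstart linked above; the channel name /topic/InvoiceStatementUpdates is a placeholder for a PushTopic you would have to define in your own org:

    import asyncio
    from aiosfstream import SalesforceStreamingClient

    async def stream_events():
        # Connect with username-password authentication (same
        # parameters as above; fill in your connected app's values).
        async with SalesforceStreamingClient(
                consumer_key="<consumer key>",
                consumer_secret="<consumer secret>",
                username="<username>",
                password="<password>") as client:
            # Subscribe to a PushTopic channel; this topic name is a
            # placeholder and must exist in your org.
            await client.subscribe("/topic/InvoiceStatementUpdates")
            # Consume messages as they arrive from the stream.
            async for message in client:
                topic = message["channel"]
                data = message["data"]
                print(f"{topic}: {data}")

    if __name__ == "__main__":
        asyncio.run(stream_events())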

Copy data from S3 to Aurora Postgres

Scenario 1: Copy data from S3 to Aurora Postgres (versions later than v9).

How?: Use the aws_s3.table_import_from_s3 function to import the data from S3 into Aurora Postgres.

Steps:

1. A sample file with columns id, prefix and mstr_id is copied to S3.

2. Create the schema on Aurora Postgres (with the required columns):

    DROP TABLE core.mstr;
    CREATE TABLE core.mstr (
        id      varchar(300) NULL,
        prefix  varchar(300) NULL,
        mstr_id float8 NULL
    );

3. Run the copy command to transfer the data from S3 to Aurora Postgres:

    SELECT aws_s3.table_import_from_s3(
        'core.mstr',
        'id,prefix,mstr_id',
        '(format csv, header true)',
        '<bucket-name>',
        'MSTR_DATA/part_file_00.csv',
        'us-east-2',
        '<access key>',
        '<secret key>'
    );

Note: If an IAM role is attached, the access keys need not be specified.

    SELECT aws_s3.table_import_from_s3('core.mstr', 'id,prefix,mst...
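For completeness, here is a sketch of the IAM-role variant that the truncated statement above starts to show. It assumes the aws_s3 extension has been installed on the instance and that an IAM role with read access to the bucket is attached to the cluster; with a role in place, the S3 location is passed through aws_commons.create_s3_uri and the key arguments are dropped:

    -- One-time setup: install the extension (CASCADE also pulls in aws_commons).
    CREATE EXTENSION IF NOT EXISTS aws_s3 CASCADE;

    -- IAM-role variant: no access/secret keys; the cluster's attached
    -- role must allow s3:GetObject on the bucket.
    SELECT aws_s3.table_import_from_s3(
        'core.mstr',
        'id,prefix,mstr_id',
        '(format csv, header true)',
        aws_commons.create_s3_uri(
            '<bucket-name>',
            'MSTR_DATA/part_file_00.csv',
            'us-east-2'
        )
    );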