Mirror of https://github.com/hastagAB/Awesome-Python-Scripts.git, synced 2024-11-23 20:11:07 +00:00
Added script to send messages to sqs in parallel (#165)
* Added script to send messages to sqs in parallel
* Update README.md
This commit is contained in:
parent fa07478deb
commit 77f5f99296
README.md
@@ -152,6 +152,7 @@ So far, the following projects have been integrated to this repo:
 |[Random_Email_Generator](Random_Email_Generator)|[Shubham Garg](https://github.com/shub-garg)|
 |[Tambola_Ticket_Generator](Tambola_Ticket_Generator)|[Amandeep_Singh](https://github.com/Synster)|
 | [Py_Cleaner](Py_Cleaner) | [Abhishek Dobliyal](https://github.com/Abhishek-Dobliyal)
+|[Send messages to sqs in parallel](send_sqs_messages_in_parallel)|[Jinam Shah](https://github.com/jinamshah)|


 ## How to use :
send_sqs_messages_in_parallel/README.md (new file, 19 lines)
@@ -0,0 +1,19 @@
# Send messages to sqs in parallel

A Python script that takes a file containing a large number of messages (one JSON record per line) and sends them to an SQS queue in a highly parallelized manner.
<br>
This is especially useful for batch-processing requests.
<br>
Works when `aws configure` has been run correctly or an IAM role is attached to the machine.

## Requirement

```bash
pip3 install boto3
```

## Usage

Edit `send_to_sqs.py` to set the input file path, queue URL and region, then run it with the number of parallel processes as the first argument:

```bash
$ python3 send_to_sqs.py <number_of_processes>
```
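The README does not show the expected input format; judging from `send_to_sqs.py`, the file should be JSON Lines, i.e. one JSON object per line. A minimal sketch for producing such a file — the `messages.jsonl` name and the record fields are illustrative only, not part of the original script:

```python
import json

# Illustrative records; any JSON-serializable objects work,
# since send_to_sqs.py simply json.loads() each line.
records = [
    {"order_id": 1, "action": "process"},
    {"order_id": 2, "action": "process"},
]

# Write one JSON object per line (JSON Lines), matching how
# send_to_sqs.py reads the file: [json.loads(line) for line in f].
with open("messages.jsonl", "w") as f:
    for record in records:
        f.write(json.dumps(record) + "\n")
```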
send_sqs_messages_in_parallel/requirements.txt (new file, 1 line)
@@ -0,0 +1 @@
boto3==1.10.50
send_sqs_messages_in_parallel/send_to_sqs.py (new file, 37 lines)
@@ -0,0 +1,37 @@
import boto3
import json
import sys
from multiprocessing import Pool

# Input file with one JSON record per line, target queue URL and region.
file = '/path/to/file_with_records_as_jsonl'
queue_url = 'your-queue-url'
region = 'your-region'
# Number of parallel worker processes, passed as the first CLI argument.
processor_count = int(sys.argv[1])


def send_message(data):
    """Send one chunk of records to SQS and return how many were sent."""
    sqs_client = boto3.client('sqs', region_name=region)
    # send_message_batch accepts at most 10 entries, each needing a unique Id.
    for i in range(0, len(data), 10):
        batch = data[i:i + 10]
        sqs_client.send_message_batch(
            QueueUrl=queue_url,
            Entries=[{'Id': str(j), 'MessageBody': json.dumps(x)}
                     for j, x in enumerate(batch)]
        )
    return len(data)


def main(file):
    total_records_sent = 0

    with open(file) as f:
        data = [json.loads(line) for line in f]

    # Split the records into one chunk per worker process.
    chunk_size = max(1, len(data) // processor_count)
    batched_data = [data[i:i + chunk_size]
                    for i in range(0, len(data), chunk_size)]

    # Send the chunks in parallel and add up the per-chunk counts.
    with Pool(processes=processor_count) as pool:
        for sent in pool.imap_unordered(send_message, batched_data):
            total_records_sent += sent

    print(f'Sent {total_records_sent} records to the queue')


if __name__ == "__main__":
    main(file)
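As a quick sanity check after a run, one can ask SQS for the queue's approximate message count. A minimal sketch using boto3's `get_queue_attributes`; `queue_url` and `region` are the same placeholders used in `send_to_sqs.py`:

```python
import boto3

queue_url = 'your-queue-url'   # same placeholder as in send_to_sqs.py
region = 'your-region'

sqs_client = boto3.client('sqs', region_name=region)

# ApproximateNumberOfMessages is an estimate of visible messages in the queue.
response = sqs_client.get_queue_attributes(
    QueueUrl=queue_url,
    AttributeNames=['ApproximateNumberOfMessages']
)
print(response['Attributes']['ApproximateNumberOfMessages'])
```

Note that `send_message_batch` also returns `Successful` and `Failed` lists per call; the script above does not inspect them, so this count is only an approximate confirmation of delivery.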
|
Loading…
Reference in New Issue
Block a user