AWS Lambda: Uploading to S3 with Python
2023-04-18 15:56:30
Code
Write the Lambda function
The function queries the database, generates test.csv locally, and uploads it to the bthlt/ path of the s3://test-bucket-dev bucket.
import logging
import os

import boto3
import pymysql
from botocore.exceptions import ClientError

db = pymysql.connect(host='****.****',
                     user='****',
                     password='****',
                     database='****')
cursor = db.cursor()


def cursor_query_all(sql):
    try:
        cursor.execute(sql)
    except Exception as e:
        print("Caught exception: " + str(e))
    return cursor.fetchall()


def get_db_data():
    sql = """select * from test"""
    return cursor_query_all(sql)


def upload_file(file_name, bucket, object_name=None):
    if object_name is None:
        object_name = os.path.basename(file_name)
    s3_client = boto3.client('s3')
    try:
        s3_client.upload_file(file_name, bucket, object_name)
    except ClientError as e:
        logging.error(e)
        return False
    return True


def lambda_handler(event, context):
    # /tmp is the only writable path inside the Lambda runtime
    with open('/tmp/test.csv', 'w') as v_file:
        results = get_db_data()
        v_file.write('head1,head2,head3\n')
        for result in results:
            # str() guards against non-string column values
            v_file.write(','.join(map(str, result)) + '\n')
    upload_file('/tmp/test.csv', 'test-bucket-dev', 'bthlt/test.csv')
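One caveat with the hand-rolled ','.join above: it produces a malformed row if a column value itself contains a comma, quote, or newline. As an alternative sketch (the helper name here is illustrative, not part of the original function), the standard-library csv module handles the quoting automatically:

```python
import csv
import io


def rows_to_csv(rows, header):
    """Serialize query rows to CSV text; csv.writer quotes any
    field containing commas, quotes, or newlines."""
    buf = io.StringIO()
    writer = csv.writer(buf)
    writer.writerow(header)
    writer.writerows(rows)
    return buf.getvalue()


# A field with an embedded comma is quoted instead of splitting the row.
print(rows_to_csv([(1, 'a,b', 'c')], ['head1', 'head2', 'head3']))
```

The same idea drops into lambda_handler by writing rows_to_csv(results, header) to /tmp/test.csv instead of joining by hand.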
Terraform deployment
Write the dependency file requirements.txt
requirements.txt
boto3==1.20.23
PyMySQL==1.0.2
botocore==1.23.23
Write the packaging script and run it to generate test.zip
package.sh
#!/bin/bash
mkdir deploy
cp test.py deploy
cp requirements.txt deploy
cd deploy
pip install -r requirements.txt -t ./
zip -r ../test.zip *
cd ..
rm -rf deploy
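Because the handler string test.lambda_handler resolves relative to the archive root, a quick sanity check that test.py sits at the root of test.zip can save a failed deployment. A small sketch using the standard-library zipfile module (the helper name is made up for illustration):

```python
import zipfile


def zip_has_handler(zip_path, module='test.py'):
    """Return True if the deployment archive contains the handler
    module at its root, where the Lambda runtime looks for it."""
    with zipfile.ZipFile(zip_path) as zf:
        return module in zf.namelist()
```

Running zip_has_handler('test.zip') against the archive produced by package.sh should return True before terraform apply is attempted.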
Write the Terraform to deploy the function and run it daily via an EventBridge (CloudWatch Events) rule
lambda.tf
resource "aws_iam_role" "test" {
  assume_role_policy = jsonencode(
    {
      Statement = [
        {
          Action = "sts:AssumeRole"
          Effect = "Allow"
          Principal = {
            Service = "lambda.amazonaws.com"
          }
        },
      ]
      Version = "2012-10-17"
    }
  )
  force_detach_policies = false
  max_session_duration  = 3600
  name                  = "test"
  path                  = "/service-role/"
}

resource "aws_lambda_function" "test" {
  function_name    = "test-upload-s3"
  handler          = "test.lambda_handler"
  role             = aws_iam_role.test.arn
  runtime          = "python3.8"
  memory_size      = 128
  filename         = "lambda/test.zip"
  source_code_hash = filebase64sha256("lambda/test.zip")
}
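Terraform's filebase64sha256() is simply the base64-encoded SHA-256 digest of the zip; when the digest changes, Terraform re-uploads the code. The same value can be reproduced in Python to check what hash Terraform will compute for a given archive (a sketch; the function name is illustrative):

```python
import base64
import hashlib


def source_code_hash(path):
    """Base64-encoded SHA-256 of a file, matching the value
    Terraform's filebase64sha256() supplies to source_code_hash."""
    with open(path, 'rb') as f:
        digest = hashlib.sha256(f.read()).digest()
    return base64.b64encode(digest).decode('ascii')
```

Comparing source_code_hash('lambda/test.zip') before and after running package.sh shows whether the next apply will push new code.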
event.tf
resource "aws_cloudwatch_event_rule" "every_day_upload_file_hours" {
  name                = "test-file-every-day-${terraform.workspace}"
  # fires once a day at 01:00 UTC
  schedule_expression = "cron(0 1 * * ? *)"
}

resource "aws_cloudwatch_event_target" "event_target_upload_files_s3" {
  count     = terraform.workspace == "prod" ? 1 : 0
  target_id = "every_day_upload_file_hours"
  rule      = aws_cloudwatch_event_rule.every_day_upload_file_hours.name
  arn       = aws_lambda_function.test.arn
}

resource "aws_lambda_permission" "lambda_permission_upload_files_s3" {
  count         = terraform.workspace == "prod" ? 1 : 0
  action        = "lambda:InvokeFunction"
  function_name = aws_lambda_function.test.function_name
  principal     = "events.amazonaws.com"
  source_arn    = aws_cloudwatch_event_rule.every_day_upload_file_hours.arn
}