Questions tagged [amazon-s3]

11460 questions
1 vote · 3 answers · 112 views

ETL to CSV files, split up and then pushed to S3 to be consumed by Redshift

Just getting started with Kiba, didn't find anything obvious, but I could be just channeling my inner child (who looks for their shoes by staring at the ceiling). I want to dump a very large table to Amazon Redshift. It seems that the fastest way to do that is to write out a bunch of CSV files to an...
Ken Mayer
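
The question uses Kiba (Ruby); as a language-neutral illustration of the pattern it describes, here is a minimal Python sketch of writing gzipped CSV chunks under one S3 prefix and loading them with a single Redshift COPY. Bucket, prefix, table, and IAM role names are hypothetical.

```python
import gzip
import boto3
import psycopg2  # any Redshift-compatible PostgreSQL driver works

s3 = boto3.client("s3")
BUCKET, PREFIX = "my-etl-bucket", "exports/big_table/"  # hypothetical names

def upload_chunk(rows, part_no):
    """Gzip one CSV chunk and upload it under the shared prefix."""
    body = gzip.compress("\n".join(rows).encode("utf-8"))
    s3.put_object(Bucket=BUCKET, Key=f"{PREFIX}part-{part_no:05d}.csv.gz", Body=body)

def load_into_redshift(conn):
    """A single COPY picks up every object under the prefix in parallel."""
    with conn.cursor() as cur:
        cur.execute(f"""
            COPY my_schema.big_table
            FROM 's3://{BUCKET}/{PREFIX}'
            IAM_ROLE 'arn:aws:iam::123456789012:role/RedshiftCopyRole'
            CSV GZIP;
        """)
    conn.commit()
```
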
1 vote · 1 answer · 336 views

missing required option :name

I am trying to set up AWS, and carrierwave to upload pictures from my website. I keep getting the error 'missing required option :name' when I try to upload/update the posts though. I have followed tutorials to set up my S3 account and to get carrierwave.rb set up. Please let me know if you have any...
Steph Simpson
1 vote · 0 answers · 712 views

AmazonS3Exception: Bad Request 400 when trying to use S3AFilesystem

I have Java code which tries to initialize a remote filesystem on S3 using configuration (this was previously on HDFS and I am trying to move it to S3 without modifying the code too much). This is the config: fs.s3a.aws.credentials.provider=com.amazonaws.auth.DefaultAWSCredentialsProviderChain fs.defau...
1 vote · 1 answer · 122 views

Gulp AWS Access Denied issue

This is my first time setting up a Gulp task for publishing and my first time working with AWS in more depth than just using what's already been set up by somebody else. The error I get when I call the task is AccessDenied: Access Denied and my code is: gulp.task('publish-staging', function() { var...
dnmh
1 vote · 1 answer · 221 views

Get latest AWS S3 folder when both the folder and the files inside it are created at the same time (boto3)

I'm trying to get the latest folder in a given S3 prefix using the code below. For example: s3a://mybucket/data/timestamp=20180612165132/part1.parquete s3a://mybucket/data/timestamp=20180612165132/part2.parquete s3a://mybucket/data/timestamp=20180613165132/part1.parquete s3a://mybucket/data/timestamp=2018061416513...
shiv455
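
One way to find the newest timestamp= partition is to list the common prefixes under the parent path and sort them. A boto3 sketch, assuming the bucket/prefix layout shown in the question:

```python
import boto3

s3 = boto3.client("s3")

def latest_partition(bucket="mybucket", prefix="data/"):
    """Return the greatest timestamp= prefix, e.g. 'data/timestamp=20180614165132/'."""
    paginator = s3.get_paginator("list_objects_v2")
    prefixes = []
    for page in paginator.paginate(Bucket=bucket, Prefix=prefix, Delimiter="/"):
        prefixes.extend(p["Prefix"] for p in page.get("CommonPrefixes", []))
    # Fixed-width yyyymmddHHMMSS timestamps sort correctly as plain strings.
    return max(prefixes) if prefixes else None
```
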
1 vote · 0 answers · 190 views

Amazon S3: getSignedUrl: “Missing required key 'Bucket' in params”

Struggled here for two days. I am kind of new to JavaScript and AWS, so any hint will be appreciated. I have 11 buckets. The others work fine except this one. When I pass in another bucket name and key value, it works, but when I pass in the one I need I get the error: 'Missing required key 'Bucket' in par...
William Shu
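
The question uses the JavaScript SDK, where this error typically means params.Bucket was undefined at call time. For comparison, a boto3 sketch of the same presign call showing the two required parameters (bucket and key names are hypothetical):

```python
import boto3

s3 = boto3.client("s3")

# Both Bucket and Key must be non-empty strings at call time.
url = s3.generate_presigned_url(
    "get_object",
    Params={"Bucket": "my-bucket-11", "Key": "photos/cat.png"},  # hypothetical names
    ExpiresIn=3600,
)
print(url)
```
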
1 vote · 0 answers · 305 views

Error executing “PutObject” on AWS

I am trying to read a .png file from the Laravel storage path and upload it to Amazon S3, but I'm getting the issue below: { 'code': 422, 'status': 'error', 'data': { 'exception': 'Error executing \'PutObject\' on \'https://s3.amazonaws.com/mrxmms/123/12345_ach.png\'; AWS HTTP error: count():...
Logeshkumar
9 votes · 2 answers · 4.5k views

AWS Lambda triggered by PUT to s3 bucket in separate account

I am trying to trigger a Lambda function to run on updates to an S3 bucket. The S3 bucket that I am attempting to have trigger the Lambda is in a separate AWS account. The approach I have tried is setting up a role in the account with the S3 bucket that has all the privileges on the bucket. Th...
BBS
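
For a cross-account trigger, the Lambda's resource policy has to allow s3.amazonaws.com to invoke it from the bucket-owning account, in addition to the bucket-side notification. A hedged boto3 sketch; account IDs, ARNs, and names are hypothetical:

```python
import boto3

# In the Lambda-owning account: allow S3 in the other account to invoke the function.
lam = boto3.client("lambda")
lam.add_permission(
    FunctionName="process-upload",                      # hypothetical
    StatementId="allow-s3-cross-account",
    Action="lambda:InvokeFunction",
    Principal="s3.amazonaws.com",
    SourceArn="arn:aws:s3:::bucket-in-other-account",   # hypothetical
    SourceAccount="111111111111",                       # bucket owner's account ID
)

# In the bucket-owning account: add the notification pointing at the function ARN.
s3 = boto3.client("s3")
s3.put_bucket_notification_configuration(
    Bucket="bucket-in-other-account",
    NotificationConfiguration={
        "LambdaFunctionConfigurations": [{
            "LambdaFunctionArn": "arn:aws:lambda:us-east-1:222222222222:function:process-upload",
            "Events": ["s3:ObjectCreated:Put"],
        }]
    },
)
```
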
1 vote · 1 answer · 56 views

Python script that moves specific files between S3 buckets

So I'm still a rookie when it comes to coding in Python, but I was wondering if someone could be so kind as to help me with a problem. A client I work for uses the eDiscovery system Venio. They have a web, app, database, and Linux server running off of EC2 instances in AWS. Right now when customers u...
Nick Hopkins
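
A minimal sketch of the "move between buckets" part with boto3 (S3 has no native move, so it is a copy followed by a delete); bucket names and the filename filter are hypothetical:

```python
import boto3

s3 = boto3.resource("s3")

def move_matching(src_bucket, dst_bucket, prefix="", suffix=".pdf"):
    """Copy objects whose key matches the filter, then delete the originals."""
    for obj in s3.Bucket(src_bucket).objects.filter(Prefix=prefix):
        if obj.key.endswith(suffix):
            s3.Object(dst_bucket, obj.key).copy_from(
                CopySource={"Bucket": src_bucket, "Key": obj.key})
            obj.delete()

# Example (hypothetical buckets):
# move_matching("venio-intake", "venio-processed", prefix="uploads/")
```
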
1 vote · 0 answers · 59 views

Allow “ObjectCreated” event notification from my zappa app

I'm creating a zappa app so I can perform a lambda function when an object is created in my S3 bucket. At the moment, when trying to update my zappa app via zappa update dev I get this error: botocore.exceptions.ClientError: An error occurred (AccessDenied) when calling the GetBucketNotification ope...
Zorgan
1 vote · 0 answers · 81 views

Using Paperclip to upload files to S3

I've been trying to set up Paperclip to allow us to send files to S3. I'm not very familiar with S3 or Paperclip, so I've been following the tutorial at https://devcenter.heroku.com/articles/paperclip-s3, but I've run into some issues. As far as I can tell, I've followed the directions exactly, b...
Ethanph892
1 vote · 1 answer · 526 views

Spark Streaming with S3 vs Kinesis

I'm writing a Spark Streaming application where the input data is put into an S3 bucket in small batches (using Database Migration Service - DMS). The Spark application is the only consumer. I'm considering two possible architectures: Have Spark Streaming watch an S3 prefix and pick up new objects a...
lfk
1 vote · 1 answer · 550 views

Best SQL client for AWS Athena that supports queries sharing [closed]

I'm looking for the best SQL client to use with AWS Athena (with the JDBC driver). One of my needs is to share queries between users, and I would like to know how I can do that in the best way. The options I see are: using the web Athena query editor and saving the queries in the 'Saved Queries' tab. Ge...
Shir
1 vote · 0 answers · 503 views

Image Upload to S3 using CKEditor

I am able to drag and drop images to a CKEditor text area and upload the images to an S3 bucket. However, the only message I get from window.parent.CKEDITOR.tools.callFunction is 'undefined' no matter what I do, even if I hard code the response, and the image is removed from the text editor. I also...
user2449001
1 vote · 0 answers · 117 views

zfs send of 7TB snapshot to S3 upload results in "An error occurred (Unknown) when calling the CompleteMultipartUpload operation"

I have a CentOS 7.4.1708 server that I am attempting to back up to AWS S3. The kernel is 3.10.0-693.17.1.el7. Since the filesystem I'm trying to back up is ZFS and I have it on scheduled snapshots, I believed I could do a ZFS send to S3 and back up my files that way. This is the command I attempted: zfs s...
PolkaRon
1 vote · 1 answer · 275 views

Copying files from HDFS to S3 on EMR cluster using S3DistCp

I am copying 800 Avro files, around 136 MB in size, from HDFS to S3 on an EMR cluster, but I'm getting this exception: 8/06/26 10:53:14 INFO mapreduce.Job: map 100% reduce 91% 18/06/26 10:53:14 INFO mapreduce.Job: Task Id : attempt_1529995855123_0003_r_000006_0, Status : FAILED Error: java.lang.RuntimeExc...
Waqar Ahmed
1 vote · 0 answers · 106 views

Importing data into Neo4j from S3 bucket using authentication

I'm new to Neo4j and am testing it on an EC2 server to see if we could use it for storing our ~1.5 nodes and their connections (currently using Redshift). I want to load all the data from Redshift into the Neo4j DB. I also work a lot with EMR clusters and usually store most of my data on S3. Is there any way to...
JustinCase
1 vote · 0 answers · 135 views

AWS S3 proxy saves raw form-data instead of the actual file. Why?

Good morning, everyone. I am trying to upload a binary file (either a picture or an audio file) directly to an S3 bucket from a web browser by using AWS API Gateway, while avoiding Lambda functions and signed URLs, following the guide at https://docs.aws.amazon.com/apigateway/latest/developerguide/in...
Fabio D.
1 vote · 1 answer · 110 views

Cannot Archive Data from AWS Kinesis to Glacier

I am working on a data processing application hosted as a web service on EC2; each second a small data file (less than 10 KB) in .csv format is generated. Problem statement: archive all the generated data files to Amazon Glacier. My approach: as the data files are very small, I store the files in AW...
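
Glacier has no direct Kinesis integration; a common route is to land the files in S3 and let a lifecycle rule transition them. A boto3 sketch with hypothetical bucket name, prefix, and timing:

```python
import boto3

s3 = boto3.client("s3")

s3.put_bucket_lifecycle_configuration(
    Bucket="my-archive-bucket",            # hypothetical
    LifecycleConfiguration={
        "Rules": [{
            "ID": "archive-csv-files",
            "Filter": {"Prefix": "incoming/"},
            "Status": "Enabled",
            # Move objects to Glacier 30 days after creation.
            "Transitions": [{"Days": 30, "StorageClass": "GLACIER"}],
        }]
    },
)
```
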
1 vote · 0 answers · 191 views

S3: Getting Access Denied for user with AmazonS3FullAccess permission policy

I have created an IAM user and attached a role that grants AmazonS3FullAccess and AdministratorAccess, but when I try to upload a file to the bucket I get: AccessDeniedAccess Denied
montjoile
1 vote · 0 answers · 302 views

Import CSV data into aurora postgresql db

I have data for all the tables stored in CSV format in an S3 bucket. I am not able to use AWS Data Pipeline. Is there a way to programmatically import this data into an Aurora PostgreSQL DB?
Punter Vicky
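
One programmatic option is to pull each CSV from S3 and feed it to PostgreSQL's COPY ... FROM STDIN. A sketch assuming boto3 and psycopg2, with hypothetical bucket, key, and table names:

```python
import io
import boto3
import psycopg2

s3 = boto3.client("s3")

def load_csv(conn, bucket="my-data-bucket", key="exports/users.csv", table="users"):
    """Download an S3 CSV and load it into Aurora PostgreSQL via COPY ... FROM STDIN."""
    body = s3.get_object(Bucket=bucket, Key=key)["Body"].read()  # fine for small/medium files
    with conn.cursor() as cur:
        cur.copy_expert(
            f"COPY {table} FROM STDIN WITH (FORMAT csv, HEADER true)",
            io.BytesIO(body),
        )
    conn.commit()
```
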
1 vote · 0 answers · 558 views

SSL certificate error - Jupyter notebook

I'm using requests in a Jupyter notebook to connect to https://dynamodb.eu-west-3.amazonaws.com/ (Amazon AWS) and have been getting the error: SSLError(SSLError('bad handshake: Error([('SSL routines', 'ssl3_get_server_certificate', 'certificate verify failed')],)',),)) It works with verify...
user3294008
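
When requests fails certificate verification in a notebook, pointing it at an up-to-date CA bundle is a safer fix than verify=False. A small sketch using certifi:

```python
import certifi
import requests

# Use certifi's current CA bundle instead of disabling verification.
resp = requests.get(
    "https://dynamodb.eu-west-3.amazonaws.com/",
    verify=certifi.where(),
)
print(resp.status_code)
```
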
1 vote · 1 answer · 246 views

aws s3 cp: don't copy if file is already there

I have two requirements for copying files from one bucket to another: Logical: copy the website redirect from the source file to the destination file for each file. (If s3://bucket-src/x.txt has website redirect /foo.txt and s3://bucket-src/y.txt has website redirect /bar.txt, then the final result...
lf215
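
aws s3 cp has no built-in "skip if the destination key already exists" flag, but the check is easy per object with boto3. A hedged sketch of just that part (bucket names hypothetical; the website-redirect requirement is not handled here):

```python
import boto3
from botocore.exceptions import ClientError

s3 = boto3.client("s3")

def copy_if_absent(src_bucket, dst_bucket, key):
    """Copy key from src to dst only if dst does not already have it."""
    try:
        s3.head_object(Bucket=dst_bucket, Key=key)
        return False  # already there, skip
    except ClientError as err:
        if err.response["Error"]["Code"] != "404":
            raise
    s3.copy_object(
        Bucket=dst_bucket, Key=key,
        CopySource={"Bucket": src_bucket, "Key": key},
    )
    return True
```
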
1 vote · 1 answer · 70 views

Encrypting large streams to be sent via Amazon S3

I want to encrypt a stream and then send it using Amazon S3. I'm using legacy code and have two important parameters: the non-encrypted InputStream and its length. This is important because AmazonS3Client wants to know the length of the stream before it uploads it. Encrypting a stream is not a very difficult tas...
menteith
1 vote · 1 answer · 65 views

AWS X-Ray daemon monitor throws InvalidSegment error code

I am trying to run this demo sample by AWS: https://github.com/awslabs/lambda-refarch-mapreduce When I run the program locally, I receive this message on my AWS X-Ray daemon monitor: 2018-06-28T16:39:06+05:30 [Error] Unprocessed segment: { ErrorCode: 'InvalidSegment', Id: '20bc7ab3728074...
bhaskar das
1 vote · 0 answers · 27 views

An issue accessing an AWS S3 object

I have an issue with getting an object from an AWS S3 bucket. It is in the standard storage class. I can access and download it from the AWS console, but when I try to download it via the awscli with the same user, it gives me a file-not-found error. I'm 100% sure that the paths of the files are correct...
amr farid
1 vote · 0 answers · 114 views

Postman call to get S3 Bucket Location Fails for regions other than “us-east-1”

In Postman, I am using the GET request below to get the location of my S3 bucket. Request type: GET API: https://mybucketname.s3.amazonaws.com/?location Authorization: I am choosing AWS Signature, passing the access and secret keys, and giving the service name as s3. But the problem i...
aravindp03
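
For comparison, the same GetBucketLocation call through boto3 handles the SigV4 region for you. A small sketch (bucket name hypothetical):

```python
import boto3

s3 = boto3.client("s3")

# LocationConstraint is None for us-east-1, otherwise the region name.
resp = s3.get_bucket_location(Bucket="mybucketname")
print(resp.get("LocationConstraint") or "us-east-1")
```
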
1 vote · 0 answers · 38 views

How to generate an Authorization header in Laravel for an S3 delete operation from Angular?

I have configured the s3 disk in Laravel 5.6, with Angular 1.6 on the client side. I already have /vendor/aws/aws-sdk-php in my Laravel project. I need to generate only the header data on the server side for this operation: DELETE /ObjectName HTTP/1.1 Host: BucketName.s3.amazonaws.com Date: date Content-Length: length Author...
nilecrocodile
1 vote · 1 answer · 41 views

How to make a sync backup (AWS)?

Currently I have two buckets in the same region (50 TB). I am trying to back up all the contents from one to the other through the sync command, but sync only copies; it isn't really a backup. After the sync finished I realized that the version IDs of the objects in the new bucket are not...
Joze
1 vote · 1 answer · 533 views

AWS Lambda@Edge: Access browser cookie in origin-response triggered function

(My setup: CloudFront + S3 Origin) Hi everyone! This is what I’m trying to do: Step 1. Trigger a Lambda function on viewer request. Get cookie with user preferred language if available (this cookie is set when the user chooses site language). Step 2. Trigger a Lambda function on origin response. I...
Ernesto Stifano
1 vote · 0 answers · 161 views

Lambda updateFunctionCode is ridiculously slow

I'm deploying my Lambda function code through Lambda.updateFunctionCode (a 44 MB zip). It works fine the first time (updates within 2 minutes), but after that, when I re-upload the code with some changes, it doesn't work. The timeout error is: message: 'Connection timed out after 120000ms', code: 'TimeoutE...
Rahul
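
The timeout appears to be client-side rather than a Lambda limit, so raising the SDK's socket timeouts is one thing to try when pushing a large zip over a slow link. The question's code looks like the JavaScript SDK; the boto3 equivalent would be (function name and zip path hypothetical):

```python
import boto3
from botocore.config import Config

# Longer connect/read timeouts for uploading a large deployment package.
lam = boto3.client("lambda", config=Config(
    connect_timeout=60,
    read_timeout=600,
    retries={"max_attempts": 3},
))

with open("function.zip", "rb") as f:
    lam.update_function_code(FunctionName="my-function", ZipFile=f.read())
```
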
1 vote · 1 answer · 20 views

AWS network traffic blocks during overnight backup of 50GB table

We download a backup of our RDS MySQL to an AWS instance and then upload it to S3. There is a single 50 GB table in a DB on the host getting backed up. When this process starts, network traffic on our other AWS instances hangs (literally, network-bound processes appear to block on writing to, e.g., AWS-bas...
user1561108
1 vote · 1 answer · 173 views

Javascript S3 GET Bucket (List Objects) Formatting?

I'm trying to get an XML response from AWS S3 as outlined in this doc: https://docs.aws.amazon.com/AmazonS3/latest/API/v2-RESTBucketGET.html I'm not sure how to integrate these parameters: GET /?list-type=2 HTTP/1.1 Host: BucketName.s3.amazonaws.com Date: date Authorization: authorization string Into...
Mitch
1 vote · 1 answer · 126 views

Compressing font files in Amazon S3

I am using Amazon S3 to serve static files for my website. My server-side code is built on Django 2.0. I am using the boto3 and django-storages packages to serve static files. AWS_IS_GZIPPED is set to True in settings.py. All the static files (CSS, JS, images) are compressed. The response header has c...
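
django-storages only gzips content types listed in GZIP_CONTENT_TYPES, and font MIME types are not in its default list, so they usually need to be added explicitly. A settings sketch (the exact MIME types depend on how your fonts are served):

```python
# settings.py (django-storages + boto3 backend)
AWS_IS_GZIPPED = True

# The default GZIP_CONTENT_TYPES covers CSS/JS/SVG; add font MIME types explicitly.
GZIP_CONTENT_TYPES = (
    "text/css",
    "text/javascript",
    "application/javascript",
    "application/x-javascript",
    "image/svg+xml",
    # Font types -- adjust to the Content-Type your font files actually get.
    "font/woff",
    "font/woff2",
    "application/font-woff",
    "application/vnd.ms-fontobject",
    "font/ttf",
)
```
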
1 vote · 0 answers · 202 views

OPTIONS request returning 403 on a presigned S3 url

I have an S3 bucket with CORS enabled. When I access a resource from the bucket via GET in the browser with JavaScript, the resource loads and the OPTIONS request succeeds. When I presign a request via the AWS API, I get a 403 on the OPTIONS request for the presigned link. Th...
Stewart
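
A 403 on the preflight usually means the bucket's CORS rules don't cover the method, origin, or headers the browser is asking about. A boto3 sketch of a rule to test with (bucket and origin are hypothetical; tighten for production):

```python
import boto3

s3 = boto3.client("s3")

s3.put_bucket_cors(
    Bucket="my-presigned-bucket",  # hypothetical
    CORSConfiguration={
        "CORSRules": [{
            "AllowedOrigins": ["https://app.example.com"],  # hypothetical origin
            "AllowedMethods": ["GET", "PUT", "HEAD"],
            "AllowedHeaders": ["*"],
            "ExposeHeaders": ["ETag"],
            "MaxAgeSeconds": 3000,
        }]
    },
)
```
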
1 vote · 0 answers · 39 views

Do we get a degraded performance from S3/cloudfront after certain limit of requests?

I am building an application to read images (400kb on average) from the S3 bucket. The time taken to read them is gradually increasing to minutes. So I used a CloudFront to increase the performance. But even with the CloudFront enabled the time taken has increased after a certain number of requests....
Sainath
1 vote · 0 answers · 106 views

Multer-s3 dynamic s3 instance

I'm trying to upload files to my s3 bucket, using multer and multer-s3 for Nodejs. The problem that I have now is that I want to set up my s3 instance dynamically because the s3 account and the bucket depend on my user settings. I have the following code: My uploader var uploader = multer({ storage:...
Brayan Serrano
1 vote · 0 answers · 87 views

Is it possible to set host in fluentd s3_output

Is it possible to somehow set %{host} (not %{hostname}: that points to the local fluentd server) in the fluentd S3 path, like: s3://logs/2018/07/10/web01/misc-20180710-07_1.gz host is one of the message fields: 'host':'ip-10-78-46-14' @type s3 s3_bucket logs s3_region us-west-2 path %Y/%m/%d time_slice_for...
Andrii Petrenko
1 vote · 0 answers · 55 views

Does Rusoto have the ability to pause and resume downloads from Amazon S3?

I am investigating the Rusoto library and couldn't find the API for pausing and resuming transfers. Are there any available options?
Akiner Alkan
1 vote · 0 answers · 33 views

AmazonS3Encryption ignores EncryptionMaterials provided

I'm experimenting with retrieving an encrypted object from S3 (put there by SES). What I don't understand is that no matter what parameter I provide to KMSEncryptionMaterialsProvider(x), the object is unencrypted and retrieved successfully. Even providing a CMK that was not used by the encryption process...
eugenevd
