Questions tagged [amazon-s3]

1 vote · 1 answer · 900 views

Using AWS services to create subdomains on the fly to host a site on S3

INFO: I am working on an app built in PHP which lets users create HTML templates and publish them on the web. As the templates are static, I thought of using Amazon S3 for storing them, since it can host static websites and has good infrastructure overall for the application. QUERY: I am fa...
KillABug
4 votes · 0 answers · 65 views

How to deploy a rules kjar to S3 bucket from within the kie workbench?

I am trying to use an S3 bucket as a remote Maven repository through which to distribute the rules jar, created with the KIE Workbench UI, to where it is needed. If I package the rules with Maven from my IDE, I can successfully upload the jar to S3 using the Maven S3 wagon extension and distr...
Claude Sylvanshine
1 vote · 1 answer · 1.4k views

Programmatically setting up a static website using Amazon S3 and Route 53 APIs

Assume I have already purchased a domain example.com with IP address 203.0.113.2. Using C# and the Amazon Web Services SDK for .NET 2.0.2.2, I'd like to create a static website with a custom domain using Amazon S3 and Route 53. The manual process is described in the Amazon documentation. Whe...
bloudraak
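
For illustration, here is a hedged sketch of the same two API steps in boto3 (the question itself uses the .NET SDK): enable website hosting on the bucket, then point the domain at the S3 website endpoint in Route 53. The hosted zone ID, the S3 website zone ID, and the endpoint below are placeholders/assumptions for us-east-1 and should be verified against the AWS documentation.

    import boto3

    s3 = boto3.client("s3")
    route53 = boto3.client("route53")

    # Step 1: turn the bucket into a static website host.
    s3.put_bucket_website(
        Bucket="example.com",
        WebsiteConfiguration={
            "IndexDocument": {"Suffix": "index.html"},
            "ErrorDocument": {"Key": "error.html"},
        },
    )

    # Step 2: alias the apex domain to the S3 website endpoint.
    route53.change_resource_record_sets(
        HostedZoneId="ZXXXXXXXXXXXXX",  # hosted zone of example.com (placeholder)
        ChangeBatch={
            "Changes": [{
                "Action": "UPSERT",
                "ResourceRecordSet": {
                    "Name": "example.com.",
                    "Type": "A",
                    "AliasTarget": {
                        "HostedZoneId": "Z3AQBSTGFYJSTF",  # S3 website zone for us-east-1 (assumed)
                        "DNSName": "s3-website-us-east-1.amazonaws.com.",
                        "EvaluateTargetHealth": False,
                    },
                },
            }]
        },
    )
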
1 vote · 1 answer · 382 views

Is there any difference between the URLs you can use to access Amazon S3 files?

I noticed that there are two URLs to open my files in Amazon S3 buckets: 1) http://BUCKETNAME.s3.amazonaws.com/FOLDER/FILE.jpg 2) http://s3-sa-east-1.amazonaws.com/BUCKETNAME/FOLDER/FILE.jpg Is there any difference in the way the files are accessed, charged, or anything else? Thanks.
Rogerio Chaves
0 votes · 1 answer · 61 views

Why does the AWS Lambda CFN S3 response return 403 on a Delete event?

I'm using Serverless to deploy an application where I use a Custom Resource to migrate an RDS database. Everything works while I deploy, but when I delete the stack the Custom Resource times out after an hour with the message "Custom Resource failed to stabilize in expected time.". The request to the...
kontrollanten
1 vote · 1 answer · 2.1k views

Get Absolute Path of Each S3 Object in Bucket

Given an AWSS3Client, how can I get a complete list of all S3 Objects' paths? Example: Bucket Name: foo has 5 objects foo/bip/baz foo/bip/bap foo/bar/1 foo/bar/2 foo/1234 I'd like to get a List[String] consisting of those 5 items. How can I do this?
Kevin Meredith
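
The question appears to target the Java/Scala SDK (AWSS3Client, List[String]); as a hedged illustration of the same idea, here is a boto3 sketch that pages through the bucket and collects every key. The bucket name foo comes from the question.

    import boto3

    s3 = boto3.client("s3")
    keys = []

    # list_objects_v2 returns at most 1000 keys per response, so use a paginator
    # to walk the entire bucket.
    paginator = s3.get_paginator("list_objects_v2")
    for page in paginator.paginate(Bucket="foo"):
        for obj in page.get("Contents", []):
            keys.append(obj["Key"])

    # Keys come back without the bucket name; prefix them if full paths are needed.
    print([f"foo/{key}" for key in keys])
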
0 votes · 0 answers · 2 views

AWS Glue writing a small dataframe to S3 taking too long

I have a Glue job to perform some aggregations on JSON data and write the results to S3. The results file I'm trying to write is less than 1 KB. With 20 DPUs, the time it takes to write a single file is 5 minutes, which is not acceptable. Does anyone know what could be a possible reason for the slowness?...
user1527762
1 vote · 2 answers · 1.7k views

Amazon AWS S3 file upload. How to set file permissions?

I'm uploading a file using Amazon's AWS SDK (S3), and everything is working fine. Below is my code: final AWSCredentials credentials = new AWSCredentials() { @Override public String getAWSAccessKeyId() { return "...myAccessKey..."; } @Override public String getAWSSecretKey() { return "...mySecretKey...
1 vote · 1 answer · 1.3k views

Can I limit the size of an object put into S3 via the JavaScript API?

It is possible to use JavaScript APIs to upload objects to S3 and it is possible to have a fine-grain authorization using IAM policies. For instance, see this policy: { "Version": "2012-10-17", "Statement": [ { "Action": [ "s3:PutObject", "s3:PutObjectAcl" ], "Resource": [ "arn:aws:s3:::YOU...
Davide Vernizzi
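
As far as I know, an IAM PutObject policy alone cannot express a size cap; one common alternative for browser uploads is a presigned POST whose policy carries a content-length-range condition. A hedged boto3 sketch generated server-side and then handed to the JavaScript client (bucket and key are placeholders):

    import boto3

    s3 = boto3.client("s3")
    post = s3.generate_presigned_post(
        Bucket="YOUR-BUCKET",                        # placeholder
        Key="uploads/example.jpg",                   # placeholder
        Conditions=[["content-length-range", 0, 5 * 1024 * 1024]],  # accept 0..5 MB
        ExpiresIn=300,
    )

    # post["url"] and post["fields"] are handed to the browser, which builds a
    # multipart/form-data POST; S3 rejects bodies outside the declared range.
    print(post["url"], post["fields"])
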
0 votes · 0 answers · 2 views

How to load a CSV file into an AWS Aurora database (relational database) using AWS Glue?

I have a CSV file which will be present (a new file daily) in an S3 bucket. From there I am trying to use AWS Glue to extract, transform & load it into an AWS Aurora database. The Aurora DB is designed as a normalized relational database, so I have to load the CSV into this relational database with information mapped be...
Rajas
1 vote · 1 answer · 208 views

Easiest way to configure a proxy for static Amazon S3 content and dynamic heroku content

My mobile app consists of a dynamic portion on heroku (foo.herokuapp.com) which serves up our API and web views for some content we expose to users who don't have the app installed. There is also of course a static landing page (http://foo.co) which is hosted on S3. Currently, I have DNS setup to r...
Bryan Alger
0 votes · 0 answers · 3 views

How do I calculate S3 pricing for short-lived files?

If I understand correctly, S3 prices that are quoted per GB/month are actually prorated by the hour, right? So if I'm using S3 as middleware between a client server and a file-generating Lambda, and my files only last 1-2 minutes, how can I properly calculate my costs? The S3 calculator doesn't consider these vari...
Mojimi
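
A back-of-the-envelope sketch of the arithmetic, assuming storage is billed on prorated GB-months (byte-hours); the rate and volumes are made-up figures to show the method, not actual pricing.

    GB_MONTH_RATE = 0.023      # USD per GB-month (illustrative rate, not a quote)
    HOURS_PER_MONTH = 730      # approximate

    file_size_gb = 0.1         # e.g. 100 MB per generated file
    lifetime_hours = 2 / 60    # each file lives roughly 2 minutes
    files_per_month = 100_000  # assumed volume

    gb_hours = file_size_gb * lifetime_hours * files_per_month
    gb_months = gb_hours / HOURS_PER_MONTH
    print(f"~{gb_months:.4f} GB-months -> ~${gb_months * GB_MONTH_RATE:.4f} storage")

    # For short-lived objects the per-request charges (PUT/GET, billed per 1,000
    # requests) usually dominate and must be added separately.
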
0 votes · 0 answers · 2 views

Hadoop S3A client support for object versions

Hadoop FileSystem supports fetching S3 objects using the S3A connector with a URI like "s3a://bucketName/keyName". Is there any way to access an S3 object with a specific versionId using an s3a URI? I couldn't find anything about this in the docs.
Tofig Hasanov
1 vote · 2 answers · 2.6k views

AWS S3 Access Denied on delete

I have a bucket that I can write to with no problem. However, when I try to delete an object, I get an error ... AccessDeniedException in NamespaceExceptionFactory.php line 91 Following the very basic example here, I came up with this command ... $result = $s3->deleteObject(array( 'Bucket' => $bucke...
Joshua Foxworth
1 vote · 1 answer · 1.5k views

Amazon S3 vs Dynamo DB

I am deciding between DynamoDB and S3 as a storage solution. Current scenario: low storage requirement, mostly non-transactional (the DB has grown to only 15MB over the last 2 years, and the most I can expect it to grow in the next few years is 50-100MB). Use cases: I want to query this DB on multiple attributes (s...
Arushi
1 vote · 2 answers · 179 views

How to avoid re-downloading media to S3 in Scrapy?

I previously asked a similar question (How does Scrapy avoid re-downloading media that was downloaded recently?), but since I did not receive a definite answer I'll ask it again. I've downloaded a large number of files to an AWS S3 bucket using Scrapy's Files Pipeline. According to the documentation...
Kurt Peek
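
If it helps, Scrapy's FilesPipeline skips files whose stored copy is younger than the FILES_EXPIRES setting (in days), so raising that value should avoid re-downloading recent media. A minimal settings.py sketch with illustrative values; the bucket path and credentials are placeholders.

    # settings.py
    ITEM_PIPELINES = {"scrapy.pipelines.files.FilesPipeline": 1}
    FILES_STORE = "s3://my-bucket/files/"   # assumed bucket path
    FILES_EXPIRES = 365                     # treat files newer than a year as fresh

    AWS_ACCESS_KEY_ID = "..."               # credentials elided
    AWS_SECRET_ACCESS_KEY = "..."
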
1 vote · 1 answer · 217 views

How to change all folder/file permissions from private to public in DigitalOcean Spaces?

I have synced my AWS S3 bucket to DigitalOcean Spaces. In my S3 bucket, all of my folders/files have private permissions, but anyone can read them. After the sync from S3 to Spaces, all of my folders/files are private, and because of that no one is able to read them. So I want to make all of my folders/files p...
Dhananjayan K
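
Since Spaces speaks the S3 API, one hedged approach is to point boto3 at the Spaces endpoint and flip each object's ACL to public-read; the endpoint, bucket name, and credentials below are placeholders.

    import boto3

    spaces = boto3.client(
        "s3",
        endpoint_url="https://nyc3.digitaloceanspaces.com",  # your Spaces region
        aws_access_key_id="SPACES_KEY",
        aws_secret_access_key="SPACES_SECRET",
    )

    BUCKET = "my-space"
    paginator = spaces.get_paginator("list_objects_v2")
    for page in paginator.paginate(Bucket=BUCKET):
        for obj in page.get("Contents", []):
            spaces.put_object_acl(Bucket=BUCKET, Key=obj["Key"], ACL="public-read")
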
0 votes · 0 answers · 15 views

Some images stored in s3 are not loading in Safari

So essentially I have a React app with a Node backend, and I'm storing 650 or so images in an S3 bucket. Everything works perfectly fine on Chrome, but in Safari about 10 images do not load, with this message being logged in the console (screenshot: "My screen when loading s3 Image"). As you can see it says there...
mlisonek
1 vote · 2 answers · 1.4k views

How to compress and upload to S3 on the fly with s3cmd

I just found that my box has 5% of hard drive space left, and I have almost 250GB of MySQL bin files that I want to send to S3. We have moved from MySQL to NoSQL and are not currently using MySQL. However, I would love to preserve the old data before migration. The problem is that I can't just tar the files in a loop be...
black sensei
0 votes · 0 answers · 4 views

AWS instance automatically removes/deletes MySQL database without user consent

An AWS instance automatically removes/deletes a MySQL database without user consent. After setting up LAMP on an AWS Ubuntu instance, I created a DB in MySQL and have a huge amount of data. But after some random number of days (around 7-10) it was deleted by the AWS instance. I imported the database again and it still repeats the same sequence.
MaYuR Solanki
0 votes · 1 answer · 7 views

How to hide AWS S3 Bucket URL with custom CNAME

I want to connect a CDN to an AWS S3 bucket, but the AWS documentation indicates that the bucket name must be the same as the CNAME. Therefore, it is very easy for others to guess the real S3 bucket URL. For example, - My domain: example.com - My S3 Bucket name: image.example.com - My CDN CNAME(image.e...
L.Kong
1 vote · 1 answer · 9 views

How to Store Large Python Dependencies on S3 (for AWS Lambda with Serverless)

I am using AWS Lambda to host a Python project, managing deployments using the Serverless framework, and have come up against the commonly-hit 50MB package storage limits. Until now, I've successfully split my requirements up per function using the serverless-python-individually and serverless-pytho...
Zac
1 vote · 1 answer · 420 views

Composing a line reader from a buffered stream using python io

I am using Python boto to interact with S3. The files I have on S3 are CSVs, and I'd like to read lines from S3 using a buffer to bound memory usage. I was wondering if anyone had a way of composing Python's io classes to achieve this? The goal is to have some sort of abstraction that is able...
dm03514
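
A hedged sketch using boto3 rather than the older boto from the question: the GetObject body is a streaming object, and its iter_lines() helper pulls data chunk by chunk, so only a small buffer is held in memory while lines are consumed. The bucket, key, and handle() function are placeholders.

    import boto3

    s3 = boto3.client("s3")
    body = s3.get_object(Bucket="my-bucket", Key="data.csv")["Body"]

    # iter_lines() reads the object in chunks and yields one line at a time,
    # keeping memory usage bounded regardless of the file size.
    for raw_line in body.iter_lines():
        handle(raw_line.decode("utf-8"))  # hypothetical per-line handler
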
0 votes · 1 answer · 5 views

Best AWS Storage Option for Exporting Directories as .zip Files?

I'm brand new to AWS products, ruby on rails, web development, and coding of any type. For my first project after a quick (and dirty) bootcamp, I'm trying to build a ruby-on-rails website that stores images and allows the user to download them as a zip file. I used the RubyZip gem to accomplish this...
Luke Rogers
1 vote · 0 answers · 14 views

Best strategy to upload files with unknown size to S3

I have a server-side application that runs through a large number of image URLs and uploads the images from these URLs to S3. The files are served over HTTP. I download them using the InputStream I get from an HttpURLConnection via the getInputStream method. I hand the InputStream to the AWS S3 client's put...
polo
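
For illustration (the question is about the Java SDK), boto3's managed upload_fileobj accepts any readable stream and switches to multipart upload automatically, so the total size never needs to be known up front; the URL, bucket, and key below are placeholders.

    import urllib.request
    import boto3

    s3 = boto3.client("s3")
    url = "https://example.com/image.jpg"   # placeholder source URL

    # The HTTP response is a readable stream; upload_fileobj reads it in parts
    # and uploads without requiring a Content-Length for the whole object.
    with urllib.request.urlopen(url) as response:
        s3.upload_fileobj(response, "my-bucket", "images/image.jpg")
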
1 vote · 1 answer · 464 views

Boto3 get only S3 buckets of specific region

The following code sadly lists all buckets of all regions and not only from "eu-west-1" as specified. How can I change that? import boto3 s3 = boto3.client("s3", region_name="eu-west-1") for bucket in s3.list_buckets()["Buckets"]: bucket_name = bucket["Name"] print(bucket["Name"])
lony
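
list_buckets is a global call regardless of the client's region, so the usual workaround is to ask each bucket for its location and filter; a hedged sketch (note that us-east-1 buckets report an empty LocationConstraint):

    import boto3

    s3 = boto3.client("s3")
    target = "eu-west-1"

    for bucket in s3.list_buckets()["Buckets"]:
        name = bucket["Name"]
        region = s3.get_bucket_location(Bucket=name)["LocationConstraint"] or "us-east-1"
        if region == target:
            print(name)
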
1 vote · 1 answer · 942 views

Extract text from Nifi attribute

I'm listing out all the keys in an S3 bucket. Below is the flow. Here, as part of the filename attribute (FetchS3Object attributes), I have the complete path of the keys, out of which I want to extract the last-but-one segment, e.g. if below is the complete path of the key /buckname/root1/subobject/s...
user805
1 vote · 1 answer · 590 views

“Request has expired” when using S3 with Active Storage

I'm using ActiveStorage for the first time. Everything works fine in development but in production (Heroku) my images disappear without a reason. They were showing ok the first time, but now no image is displayed. In the console I can see this error: GET https://XXX.s3.amazonaws.com/variants/Q7MZrLy...
Manuel Frigerio
1 vote · 1 answer · 168 views

AWS Glue Crawlers and large tables stored in S3

I have a general question about AWS Glue and its crawlers. I have some data streaming into S3 buckets, and I use AWS Athena to access it as external tables in Redshift. The tables are partitioned by hour, and some Glue crawlers update the partitions and the table structure every hour. The problem is...
flowoo
1 vote · 1 answer · 45 views

Access AWS S3 from Lambda within Default VPC

I have a Lambda function which needs to access EC2 through SSH, load files, and save them to S3. For that I have kept EC2 and Lambda both in the default VPC and the same subnet. Now the problem is that I am able to connect the function to EC2 but not to S3. It's been killing me since morning, as when I remove...
Tanisha
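
A Lambda function attached to a VPC subnet loses its default route to S3 unless the VPC has a NAT gateway or, more simply for S3, a gateway VPC endpoint. A hedged boto3 sketch of creating one; the VPC ID, route table ID, and region are placeholders.

    import boto3

    ec2 = boto3.client("ec2")
    ec2.create_vpc_endpoint(
        VpcEndpointType="Gateway",
        VpcId="vpc-0123456789abcdef0",                 # placeholder default VPC
        ServiceName="com.amazonaws.us-east-1.s3",      # match your region
        RouteTableIds=["rtb-0123456789abcdef0"],       # placeholder route table
    )
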
1 vote · 2 answers · 33 views

How to find unencrypted files in an Amazon AWS S3 bucket?

What I have: several old S3 buckets with 1M objects in each, with server-side encryption turned on. Problem: old files are unencrypted, and I can't say when encryption was turned on. So, I need to find all unencrypted files. I've tried a solution with the AWS CLI, but it is pretty slow - 1 request in 2...
Psychozoic
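
One hedged approach that is faster than a sequential CLI loop (though still one HEAD request per object) is to page through the bucket and check object heads in parallel, flagging responses without a ServerSideEncryption field; an S3 Inventory report is the cheaper option at this scale. The bucket name and worker count are placeholders.

    from concurrent.futures import ThreadPoolExecutor
    import boto3

    BUCKET = "my-old-bucket"   # placeholder
    s3 = boto3.client("s3")

    def unencrypted(key):
        # head_object includes ServerSideEncryption only when SSE was applied.
        head = s3.head_object(Bucket=BUCKET, Key=key)
        return key if "ServerSideEncryption" not in head else None

    keys = (obj["Key"]
            for page in s3.get_paginator("list_objects_v2").paginate(Bucket=BUCKET)
            for obj in page.get("Contents", []))

    with ThreadPoolExecutor(max_workers=32) as pool:
        for key in filter(None, pool.map(unencrypted, keys)):
            print(key)
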
0 votes · 0 answers · 7 views

Django command “python3 manage.py collectstatic --noinput” scrambling .py files when copying files to S3

I am trying to use Django collectstatic to send my script.py file to AWS S3. That file is then read by AWS Glue to execute a Spark job. When I manually upload the script.py, AWS Glue can read it properly. When I use the following command to send my script to AWS S3, the script.py gets scrambled pyth...
Arnab Biswas
0 votes · 2 answers · 142 views

AWS SDK JS S3 getObject Stream Metadata

I have code similar to the following to pipe an S3 object back to the client as the response using Express, which is working perfectly. const s3 = new AWS.S3(); const params = { Bucket: 'myBucket', Key: 'myImageFile.jpg' }; s3.getObject(params).createReadStream().pipe(res); Problem is, I want to be...
Charlie Fish
1 vote · 1 answer · 23 views

Python boto3: receive versionId after uploading a file to S3

I want to upload a file to a version-enabled S3 bucket and need its version number. Ideally, without a separate API call to avoid any possibility of a race condition. I'm using the following code snippet for upload (which is working fine): s3 = boto3.client("s3") s3.upload_fileobj(file_handle, bucke...
SaturnFromTitan
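
A hedged alternative sketch: put_object (unlike the managed upload_fileobj transfer) returns response metadata directly, and on a version-enabled bucket that includes the new object's VersionId, so no second call is needed. The bucket and key are placeholders; the trade-off is that very large files lose upload_fileobj's automatic multipart handling.

    import boto3

    s3 = boto3.client("s3")
    with open("report.csv", "rb") as file_handle:   # placeholder file
        response = s3.put_object(Bucket="my-bucket", Key="report.csv", Body=file_handle)

    print(response["VersionId"])   # present when bucket versioning is enabled
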
1 vote · 0 answers · 11 views

S3 PUT request from microsoft edge and IE fails because of semicolon in parameter

This issue might be related to this question, but the platform is different so I am not sure. I wrote JavaScript code in AngularJS to upload an image to S3; it seems to work fine on most modern browsers but fails on Microsoft Edge and IE 11, as far as I have tested. The code does a PUT cal...
Xavitoj Cheema
0 votes · 0 answers · 8 views

AWS SDK PHP - How to load multiple files with one link? How to create a zip file?

AWS PHP SDK. I have a bucket, and there are folders with the following files: folder1 -- photo1 -- photo2 -- photo3 -- photo4 folder2 -- pic1 -- pic2 -- pic3 -- pic4 How can I create one download link for these files: photo1, photo4, pic2, rather than downloading each file via a separate link? One more que...
1 vote · 2 answers · 2.9k views

AWS S3 returns “Identity pool id not found”

I'm using Xcode 6.4 for my project. The problem is I have an issue regarding AWS S3 (Amazon Web Services S3). What I need to do is download a file from the bucket. It says "Identity pool id not found", but I have already checked it several times and it is exactly the same as in the console. I got this error messag...
Mohammad Nurdin
1 vote · 1 answer · 1.8k views

Amazon S3 - Multiple keys to one object

I have an S3 bucket with more than 100 million objects in it, and each object has a unique key as usual. I was wondering if there is a way to assign another key to some of these objects. Something like this: Key1 ---> Object1 Key2 ---> Object2 Key3 ---> Object2 (I'd like to add this) I looked this up...
Seckin Tozlu
1 vote · 2 answers · 310 views

Best way to migrate billions of files on a single partition in a data center to S3?

We have a data center with a 10G direct connect circuit to AWS. In the data center, we have an IBM XIV storage infrastructure with GPFS filesystems containing 1.5 BILLION images (about 50k each) in the single top level directory. We could argue all day about how dumb this was, but I'd rather seek ad...
godeatgod
1 vote · 3 answers · 1.5k views

Stubbing S3 uploads in Node.js

How would I go about stubbing S3 uploads in Node.js? For insight, I'm using Mocha for tests and Sinon for stubbing, but I'm open to changing anything. I have a file that exports a function that performs the upload. It looks like this: var AWS = require('aws-sdk'); var s3 = new AWS.S3({ params: { Buc...
Baub
