Laravel S3 Adapter Invalid Argument Error

- 1 answer

I'm trying to upload Media files to Digital Ocean. Here is my code:

<?php

namespace App\Jobs;

use App\Entities\Media;
use Carbon\Carbon;
use Illuminate\Bus\Queueable;
use Illuminate\Queue\SerializesModels;
use Illuminate\Queue\InteractsWithQueue;
use Illuminate\Contracts\Queue\ShouldQueue;
use Illuminate\Foundation\Bus\Dispatchable;
use Illuminate\Support\Facades\Storage;

class UploadMediaToCloud implements ShouldQueue
{
    use Dispatchable, InteractsWithQueue, Queueable, SerializesModels;

    protected $media;

    /**
     * UploadMediaToCloud constructor.
     *
     * @param Media $media
     */
    public function __construct(Media $media)
    {
        $this->media = $media;
    }

    /**
     * Execute the job.
     *
     * @return void
     */
    public function handle()
    {
        $photoPath = $this->media->path . '/' . $this->media->name . '.' . $this->media->extension;
        $photoFullPath = env('APP_URL') . '/' . $photoPath;

        Storage::disk('spaces')->put($photoPath, file_get_contents($photoFullPath), 'public');

        $this->media->disk = 'spaces';
        $this->media->updated_at = Carbon::now();
        $this->media->save();
    }
}

and in my 'filesystems.php' config file:

'spaces' => [
    'driver' => 's3',
    'key' => env('DO_SPACES_KEY'),
    'secret' => env('DO_SPACES_SECRET'),
    'endpoint' => env('DO_SPACES_ENDPOINT', 'https://ams3.digitaloceanspaces.com'),
    'region' => env('DO_SPACES_REGION', 'ams3'),
    'bucket' => env('DO_SPACES_BUCKET'),
],
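For reference, the disk above reads its credentials from `.env`. The variable names below match the config; the values are placeholders, not real credentials:

```
DO_SPACES_KEY=your-key
DO_SPACES_SECRET=your-secret
DO_SPACES_ENDPOINT=https://ams3.digitaloceanspaces.com
DO_SPACES_REGION=ams3
DO_SPACES_BUCKET=your-bucket
```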

This job works perfectly in the local environment (my `.env` files are identical between local and prod), but in production it throws:

InvalidArgumentException
Missing required client configuration options: 

region: (string)

A "region" configuration value is required for the "s3" service
(e.g., "us-west-2"). A list of available public regions and 
endpoints can be
found at http://docs.aws.amazon.com/general/latest/gr/rande.html.

I have already tried hardcoding all the credentials, but so far nothing works.

Answer

For anyone who encounters this problem: try updating the Linux packages on your server. I believe there are some backwards-compatibility issues with the AWS SDK. Updating the packages worked for me.
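If you want to try the same fix, a sketch of the update step might look like this (assuming an apt-based server; use `yum`/`dnf` on RHEL-style systems). Checking the installed AWS SDK version via Composer can also help confirm what the server is actually running:

```shell
# Refresh package lists and upgrade installed packages (apt-based systems)
sudo apt-get update && sudo apt-get upgrade -y

# Check which version of the AWS SDK Composer has installed for this project
composer show aws/aws-sdk-php
```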

source: stackoverflow.com