Migrating an existing app to the Laravel Cloud
At the second Laravel Switzerland Association meetup of 2025, I had the pleasure of sharing my learnings on moving an existing app into the new Laravel Cloud.
I was among the lucky ones to get early access to the Laravel Cloud in December 2024. Of course, I wasn't ready to push my bigger production projects there yet, but I started playing around with a few projects.

What will we cover?
- Migrate from MySQL to Postgres
- Move from disk to S3-compatible object storage
- From local to third-party services
Migrate from MySQL to Postgres
Check your custom queries
Postgres is very similar to MySQL, but not identical. The pgsql driver handles many cases for you. For example, $query->whereLike() uses ILIKE in Postgres by default to be case-insensitive.
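A quick sketch of what that looks like in practice (the Project model and the search term are just placeholders):

// whereLike() is case-insensitive by default; on the pgsql driver it
// compiles to ILIKE, so no driver check is needed in your own code.
$projects = Project::query()
    ->whereLike('name', '%laravel%')
    ->get();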
Some queries are a bit harder. Here is an example that groups all projects by status in a defined order:
$query = auth()->user()->is_admin
    ? Project::query()->orderByRaw(
        DB::getDriverName() === 'mysql'
            ? 'FIELD(status, "in review", "published", "archived")'
            : "CASE
                WHEN status = 'in review' THEN 1
                WHEN status = 'published' THEN 2
                WHEN status = 'archived' THEN 3
                ELSE 4
              END"
    )
    : auth()->user()->projects();

return $query;
Make sure your packages support Postgres
Not every package works fine on Postgres. For example, the spatie/laravel-tags package uses custom queries for its JSON columns, so I'm currently running a quick and dirty fork of it. Eventually, they will start supporting Postgres or I will simply build my own tagging implementation.
Moving the database from MySQL to Postgres using Query Builder
Yes, the Laravel Cloud does support MySQL now, so you probably don't plan to migrate your database driver. But because I'll be using pgvector in the future, I wanted to give it a try.
There are countless tools and ways to move your database content from one driver to another. On the Flare blog, they wrote about a way to use Laravel's Query Builder. This was a good starting point, but I needed some additional features:
- Reset the sequence for tables with auto-incrementing primary keys
- Migrate tables without primary keys (like pivot tables)
Here is the command I ended up using to migrate between different database drivers:
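A minimal sketch of such a command, based on the signature and the requirements above, could look like this. The class name, the id-column check, the interpretation of --from as a starting id, and the sequence reset are assumptions, not the exact implementation from the post:

<?php

namespace App\Console\Commands;

use Illuminate\Console\Command;
use Illuminate\Support\Facades\DB;
use Illuminate\Support\Facades\Schema;

class MigrateDatabase extends Command
{
    protected $signature = 'app:migrate-database {source} {destination} {table} {--chunk=50} {--from=}';

    protected $description = 'Copy a table from one database connection to another';

    public function handle(): int
    {
        $source = $this->argument('source');
        $destination = $this->argument('destination');
        $table = $this->argument('table');
        $chunkSize = (int) $this->option('chunk');

        // Insert each chunk of rows into the destination connection.
        $copy = function ($rows) use ($destination, $table) {
            DB::connection($destination)->table($table)->insert(
                $rows->map(fn ($row) => (array) $row)->all()
            );
        };

        if (Schema::connection($source)->hasColumn($table, 'id')) {
            // Tables with an auto-incrementing primary key: chunk by id,
            // optionally resuming from a given id via --from.
            DB::connection($source)->table($table)
                ->when($this->option('from'), fn ($query, $from) => $query->where('id', '>=', $from))
                ->chunkById($chunkSize, $copy);

            // Reset the Postgres sequence so new inserts don't collide
            // with the ids we just copied over.
            if (DB::connection($destination)->getDriverName() === 'pgsql') {
                DB::connection($destination)->statement(
                    "SELECT setval('{$table}_id_seq', (SELECT MAX(id) FROM {$table}))"
                );
            }
        } else {
            // Tables without a primary key (e.g. pivot tables): plain
            // offset-based chunking, ordered by the first column.
            $firstColumn = Schema::connection($source)->getColumnListing($table)[0];

            DB::connection($source)->table($table)
                ->orderBy($firstColumn)
                ->chunk($chunkSize, $copy);
        }

        $this->info("Copied {$table} from {$source} to {$destination}.");

        return self::SUCCESS;
    }
}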
Using the command
Create an additional database connection; I called it mysqlsource.
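The extra connection can live next to the default one in config/database.php; a minimal sketch, with example env keys:

// config/database.php — an additional connection pointing at the old
// MySQL database (the connection name and env keys are just examples).
'connections' => [
    // ...

    'mysqlsource' => [
        'driver' => 'mysql',
        'host' => env('SOURCE_DB_HOST', '127.0.0.1'),
        'port' => env('SOURCE_DB_PORT', '3306'),
        'database' => env('SOURCE_DB_DATABASE'),
        'username' => env('SOURCE_DB_USERNAME'),
        'password' => env('SOURCE_DB_PASSWORD'),
        'charset' => 'utf8mb4',
        'collation' => 'utf8mb4_unicode_ci',
    ],
],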
Make sure your migrations have run on the new destination database. Here is how you use the command:
# Signature
app:migrate-database {source} {destination} {table} {--chunk=50} {--from=}
# Example usage
php artisan app:migrate-database mysqlsource pgsql users
About sequences in Postgres
Initially, I didn't know about the difference in auto-incrementing sequences in Postgres. When no new projects came in, I checked the logs. Oops, there were SQL errors because the primary keys were already taken.
Lucky for me, Aaron Francis has some free videos in his course about Postgres:

Here is what I learned about sequences in Postgres. Don't worry, the command above already has you covered:
-- Get last value
SELECT last_value FROM <table>_id_seq;
-- Get sequence information
SELECT * FROM <table>_id_seq;
-- Get next value (note: this also advances the sequence)
SELECT nextval('<table>_id_seq');
-- Get current max ID from the table
SELECT MAX(id) FROM <table>;
-- Fix the sequence so it continues after the current max ID
SELECT setval('<table>_id_seq', (SELECT MAX(id) FROM <table>));
Migrate from disk to S3-compatible Object Storage
I have a confession to make: this is the first time I'm using an S3-compatible object storage. Before, I always used the disk storage on my servers. For another project, I knew the clock was ticking as free disk space was slowly running out.
As with all things Laravel, the setup was quite painless and straightforward to implement. Because Laravel Cloud didn't offer storage buckets when I first moved Wire in the Wild, I used the Hetzner Object Storage and will most likely keep using it because it's just so affordable.
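For reference, an S3-compatible disk in config/filesystems.php looks roughly like this; the endpoint and credentials depend on your provider, and the env keys below are Laravel's defaults:

// config/filesystems.php — an S3-compatible disk; point AWS_ENDPOINT
// at your provider's S3 endpoint.
'disks' => [
    // ...

    's3' => [
        'driver' => 's3',
        'key' => env('AWS_ACCESS_KEY_ID'),
        'secret' => env('AWS_SECRET_ACCESS_KEY'),
        'region' => env('AWS_DEFAULT_REGION'),
        'bucket' => env('AWS_BUCKET'),
        'url' => env('AWS_URL'),
        'endpoint' => env('AWS_ENDPOINT'),
        'use_path_style_endpoint' => env('AWS_USE_PATH_STYLE_ENDPOINT', false),
        'throw' => false,
    ],
],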
Another migration, another artisan command
To move the files from the disk to an S3-compatible object storage, I created another artisan command and set up a source and destination file system in my Laravel app.
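A minimal sketch of such a command, matching the signature further below; the class name and the comma-handling for --directories are assumptions:

<?php

namespace App\Console\Commands;

use Illuminate\Console\Command;
use Illuminate\Support\Facades\Storage;

class MirrorStorage extends Command
{
    protected $signature = 'storage:mirror {source-disk} {destination-disk} {--directories=*}';

    protected $description = 'Copy files from one filesystem disk to another';

    public function handle(): int
    {
        $source = Storage::disk($this->argument('source-disk'));
        $destination = Storage::disk($this->argument('destination-disk'));

        // Accept both repeated options and comma-separated lists,
        // e.g. --directories=avatars,screenshots
        $directories = collect($this->option('directories'))
            ->flatMap(fn ($value) => explode(',', $value))
            ->filter()
            ->all();

        foreach ($directories as $directory) {
            foreach ($source->files($directory) as $path) {
                // Stream each file so large files don't have to be
                // loaded into memory all at once.
                $destination->writeStream($path, $source->readStream($path));

                $this->line("Mirrored {$path}");
            }
        }

        return self::SUCCESS;
    }
}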
As you can see, we are using readStream and writeStream, which should help handle larger files in a more memory-efficient way.
I moved up to 500 files per directory without any problem.
Using the command
Just as a reminder, make sure to create your source and destination file systems in your Laravel app before using the command:
# Signature
storage:mirror {source-disk} {destination-disk} {--directories=*}
# Example usage
php artisan storage:mirror local s3 --directories=avatars,screenshots
From local to third-party services
As developers, we tend to implement everything ourselves, and that's not necessarily a bad choice. For example, to create the social share images for this project, I used the Browsershot package by spatie, which uses Puppeteer under the hood.
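As a rough illustration of that approach (the URL, window size, and output path below are placeholders):

use Spatie\Browsershot\Browsershot;

// Render the social share page to a PNG via headless Chrome/Puppeteer.
Browsershot::url('https://example.com/projects/1/social-image')
    ->windowSize(1200, 630)
    ->save(storage_path('app/social/project-1.png'));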
Here is the tutorial:

At the time of the move, I couldn't get Puppeteer to work on Laravel Cloud. Luckily, Stefan Zweifel has a great service for that and sent me the cutest coupon code to use the service for free. So please check out screeenly for all your screenshot and HTML-to-PDF needs:

Should you move to the Laravel Cloud?
Video from the talk
Slides from the talk
Let's share our stories using #wejustship
Please share your Laravel Cloud stories using the hashtag #wejustship.