Using Production Data for Local Development in Laravel

Learn how to safely use production database dumps for local Laravel development with an automated Artisan command.


In a previous post, I shared how I create backups of my PostgreSQL database and store them securely. Today, I want to show how I leverage those backups for local development.

Even though I create factories and seeders for my applications, nothing beats working with real production data. It makes the application feel more authentic and helps uncover edge cases you might not consider otherwise. You also get those “what were they thinking?” moments that only real user data can provide.

The entire setup is handled by a single Artisan command:

php artisan test:setup-environment

When I run this command, the following happens:

  • The database is wiped using migrate:fresh
  • The newest SQL dump from /database/sql is imported
  • The schema is reset to public
  • Migrations are run on top of the imported data
  • All user passwords are reset

Here’s how I handle it: I store my database dumps in the /database/sql folder. Make sure to add this folder to .gitignore so you don’t accidentally commit sensitive data. I have a script that automatically finds the latest dump and imports it into my local database.
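To make that concrete, here is one possible .gitignore entry for the setup described above (the .gitkeep pattern is an optional convention for keeping the empty folder in version control):

```gitignore
# Ignore local production dumps, but keep the folder itself
/database/sql/*
!/database/sql/.gitkeep
```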

The best part? After the import, the script resets all users’ passwords to password. This makes it easy to log in as any user and test features quickly without needing to reset individual passwords or create new test accounts.

<?php

namespace App\Console\Commands;

use App\Models\Scopes\TenantScope;
use App\Models\User;
use Illuminate\Console\Command;
use Illuminate\Support\Facades\Artisan;
use Illuminate\Support\Facades\DB;
use Illuminate\Support\Facades\Hash;

class SetupEnvironment extends Command
{
    protected $signature = 'test:setup-environment';
    protected $description = 'Wipe the local database, import the latest production dump, run migrations, and reset user passwords';

    public function handle(): int
    {
        // Safety guard: never run this destructive command in production.
        if (app()->environment('production')) {
            $this->error('This command must not be run in production.');

            return 1;
        }

        Artisan::call('migrate:fresh');

        $folderPath = database_path('sql'); // folder containing your SQL dumps

        // Get all .sql files
        $files = glob($folderPath . '/*.sql');

        if (empty($files)) {
            $this->error("No SQL files found in {$folderPath}");
            return 1;
        }

        // Sort by file name descending; assumes the dumps have sortable
        // (e.g. timestamp-prefixed) names so the newest file comes first
        rsort($files, SORT_NATURAL);

        $latestFile = $files[0];
        $this->line('[' . now()->format('Y-m-d H:i:s') . "] Running latest SQL file: " . basename($latestFile));

        $connection = config('database.connections.pgsql');
        $db = $connection['database'];
        $user = $connection['username'];
        $pass = $connection['password'];
        $host = $connection['host'];
        $port = $connection['port'] ?? 5432;

        // escapeshellarg() protects against spaces and shell metacharacters.
        // -v ON_ERROR_STOP=1 makes psql exit non-zero on SQL errors; stdout
        // is discarded while stderr is kept so exec() can capture error output.
        $command = sprintf(
            'PGPASSWORD=%s psql -v ON_ERROR_STOP=1 -h %s -p %s -U %s -d %s -f %s 2>&1 >/dev/null',
            escapeshellarg($pass),
            escapeshellarg($host),
            escapeshellarg($port),
            escapeshellarg($user),
            escapeshellarg($db),
            escapeshellarg($latestFile)
        );

        $output = [];
        $returnVar = null;

        exec($command, $output, $returnVar);

        if ($returnVar !== 0) {
            $this->error("SQL script failed:\n" . implode("\n", $output));

            return 1;
        }

        $this->line('[' . now()->format('Y-m-d H:i:s') . '] SQL script executed successfully.');

        $this->line('[' . now()->format('Y-m-d H:i:s') . '] Resetting schema to public...');
        DB::statement('SET search_path TO public');

        $this->line('[' . now()->format('Y-m-d H:i:s') . '] Running migrations...');
        Artisan::call('migrate');

        $this->info(Artisan::output());

        $this->line('[' . now()->format('Y-m-d H:i:s') . '] Resetting passwords to "password"...');
        $hashedPassword = Hash::make('password');

        // Bypass the tenant scope so users across all tenants are updated
        User::withoutGlobalScope(TenantScope::class)
            ->update(['password' => $hashedPassword]);

        return 0;
    }
}

This setup has multiple advantages:

  • You work with realistic data, which helps catch bugs early.
  • You can test workflows across different user roles easily.
  • It speeds up development because you don’t have to manually create test accounts.

Of course, you need to be mindful of privacy and security. Depending on your application, you might want to anonymize sensitive user data before importing it locally. Anonymization is still on my TODO list for this script; adding it would also help keep local environments compliant with privacy regulations.
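As a starting point, anonymization could be a step at the end of the command. This is only a hypothetical sketch, assuming the users table has name and email columns; it is not part of my script, and you would adapt the fields to your own schema:

```php
use App\Models\Scopes\TenantScope;
use App\Models\User;

// Overwrite personally identifiable fields after the import.
// eachById() processes users in chunks to keep memory usage flat,
// and saveQuietly() skips model events that might send emails etc.
User::withoutGlobalScope(TenantScope::class)
    ->eachById(function (User $user) {
        $user->forceFill([
            'name'  => 'User ' . $user->id,
            'email' => 'user' . $user->id . '@example.test',
        ])->saveQuietly();
    });
```

Fields like addresses or phone numbers would get the same treatment; anything you never need locally can simply be nulled out.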

Using production data for local development isn’t just convenient—it’s educational. Seeing how real users interact with your application exposes hidden patterns and edge cases that no seeder could mimic. By combining automated database imports, migrations, and password resets, I can spin up a realistic development environment in minutes and focus on what really matters: building great features.

In short, leveraging production data locally gives you a powerful, realistic, and efficient way to develop your Laravel applications without losing sight of security and privacy.