I’m trying to upload website files using an FTP client, but I keep running into connection errors and incomplete uploads. I’m not sure if my FTP settings, ports, or transfer mode are wrong. Can someone explain the correct steps and settings to reliably upload files via FTP so my site updates properly?
How I Actually Upload Files With An FTP Client (macOS)
People keep asking how to get files onto a server without using some sketchy web panel, so here is how I usually do it with an FTP client on a Mac. Nothing fancy, just the stuff that actually works and doesn’t make you want to throw your laptop.
Step 1: Get Your FTP Details From The Server
Before you even touch an app, you need:
- Hostname or IP (something like `ftp.yourdomain.com` or `123.45.67.89`)
- Username
- Password
- Port (usually `21` for FTP, `22` for SFTP)
- Protocol type: FTP, FTPS, or SFTP (SFTP is usually the better choice if available)
If you don’t know these, they’re usually in your hosting dashboard or in the “Welcome” email your host sent you when you signed up.
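To make the protocol/port pairing above concrete, here's a tiny sketch. The helper name and the mapping layout are mine, not from any particular client; the port values are the conventional defaults listed above.

```python
# Conventional default ports per protocol, matching the list above.
# FTPS (explicit) negotiates TLS on the regular FTP port; SFTP rides SSH.
DEFAULT_PORTS = {"ftp": 21, "ftps": 21, "sftp": 22}

def default_port(protocol: str) -> int:
    """Return the usual port for a transfer protocol (hypothetical helper)."""
    return DEFAULT_PORTS[protocol.lower()]
```

If your host gave you a non-standard port in the welcome email, that always wins over these defaults.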
Step 2: Pick An FTP Client That Doesn’t Fight You
You can absolutely use some of the old, classic FTP clients out there. They do the job, but a lot of them feel like they were designed when flip phones were still cool.
On macOS I eventually settled on using a file manager that has FTP, SFTP, WebDAV and all that built in, instead of juggling multiple apps. The one I’ve ended up using is Commander One:
The nice part is that it looks like a normal dual‑pane file manager, but one of those panes can be your remote server, and the other is your Mac. So uploads basically boil down to “drag from left to right.”
Use whatever you like, but the general steps are the same in almost every client.
Step 3: Add A New FTP / SFTP Connection
In whatever client you use, there’s usually a “New Connection” / “Site Manager” / “Add Server” option.
Typical fields you’ll need to fill out:
- Protocol: FTP, FTPS, or SFTP
- Host: your server name or IP
- Port: often auto-filled, but you can set it manually
- Username / Password: the credentials from your host
- Optional: save the connection so you don’t have to type this again
In Commander One, there’s a connections manager where you can add a new FTP or SFTP entry, give it a name like “MySite Prod,” and save it. After that it’s one click to reconnect.
Step 4: Connect To The Server
Hit “Connect” or “OK” or whatever the button is in your app.
If it’s the first time connecting:
- You might see a warning about an unknown SSH key or certificate (for SFTP/FTPS).
- Usually you just verify it’s your server and accept it so the app remembers it.
Once connected, you’ll see the folders on your server. Common ones are:
- `public_html`
- `www`
- `httpdocs`
- or a custom folder your host mentions
That’s typically where your website or app files live.
Step 5: Upload Files (The Actual Part You Came For)
This is the easy bit:
- On one side: open your local folder (the files on your Mac).
- On the other side: open the remote folder (on your server).
- Select the files or folders you want to upload.
- Drag them from your Mac side to the server side, or right‑click and choose “Upload.”
Most FTP clients will:
- Show you a transfer queue
- Let you pause / resume transfers
- Retry failed files
In Commander One, you can just drag from one pane to the other, and there’s a small queue/toolbar where you can see what’s going on. It feels like moving files between two drives rather than “using FTP,” which is kind of the point.
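If you ever want to script the same step instead of dragging, Python's standard-library `ftplib` does the equivalent. This is a minimal sketch assuming plain FTP with passive mode and binary transfer; the host, credentials, and paths are placeholders you'd swap for your own.

```python
import ftplib
import os

def upload_file(host: str, user: str, password: str,
                local_path: str, remote_dir: str, timeout: int = 120) -> None:
    """Upload one local file into remote_dir over FTP (passive mode, binary)."""
    with ftplib.FTP(host, timeout=timeout) as ftp:
        ftp.login(user, password)
        ftp.set_pasv(True)                       # passive mode: NAT/firewall friendly
        ftp.cwd(remote_dir)                      # e.g. "public_html"
        name = os.path.basename(local_path)
        with open(local_path, "rb") as f:
            ftp.storbinary(f"STOR {name}", f)    # binary transfer, never ASCII

# upload_file("ftp.yourdomain.com", "user", "secret",
#             "index.html", "public_html")       # placeholder values, not run here
```

The GUI clients are doing roughly this per file, plus queueing and retries on top.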
Step 6: Check Permissions And Structure
After uploading, it’s worth checking:
- Files ended up in the right folder (e.g., your `index.php` or `index.html` is in `public_html`).
- Executable scripts and folders have reasonable permissions (`755` for folders, `644` for files is pretty standard).
Most FTP clients let you right‑click a file and adjust permissions (CHMOD) if something isn’t loading correctly.
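If you have shell access (or want to fix a local copy before uploading), the 755/644 convention can be applied in one sweep. A sketch using the standard library; the path is a placeholder, and note this changes permissions recursively, so point it only at your site folder.

```python
import os

def normalize_permissions(root: str) -> None:
    """Set 755 on every directory and 644 on every file under root."""
    for dirpath, dirnames, filenames in os.walk(root):
        os.chmod(dirpath, 0o755)                       # rwxr-xr-x for folders
        for fn in filenames:
            os.chmod(os.path.join(dirpath, fn), 0o644) # rw-r--r-- for files

# normalize_permissions("public_html")  # placeholder path
```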
Step 7: Edit / Replace Files Later
Once you’ve got the connection saved, future changes are painless:
- Connect to the server
- Open your site folder
- Drag updated files over the old ones
- Confirm “Overwrite” when asked
In tools like Commander One, you can also:
- Quickly compare what’s on your Mac versus what’s on the server
- Rename, move, or delete remote files like you would locally
- Mount multiple servers at once if you juggle several projects
It’s less “log into FTP and fight with it” and more “treat remote files almost like they’re on another disk.”
Summary
Uploading via FTP/SFTP is basically:
- Get host, username, password, port, and protocol from your provider
- Add a new connection in your FTP client
- Connect and open the correct remote folder
- Drag files from your Mac to the server
- Check that everything landed where it should
If you live inside macOS Finder and hate switching context just to move files to a server, using something like Commander One (Commander One: File Manager App - App Store) as both a file manager and FTP/SFTP client can simplify a lot. One window, two panes, and your server behaves a bit like another drive attached to your Mac.
Couple of different issues mixed together here: connection errors and incomplete uploads are usually different root causes. I’ll try to keep this practical and not rehash what @mikeappsreviewer already covered about the basic workflow.
1. First sanity check: are you using FTP, FTPS or SFTP?
A lot of hosts now:
- Disable plain FTP on port 21
- Force SFTP on port 22
- Or require FTPS on 21
If you’re getting “connection refused” or endless timeouts:
- Try SFTP on port 22 with the same username/password
- If that fails, check your host’s docs or dashboard for the exact protocol/port
- Avoid plain FTP if possible; it breaks more often and is insecure anyway
Also, some firewalls block port 21/22 on public Wi‑Fi or corporate networks. If it suddenly works from your phone hotspot, that’s your clue.
2. Active vs Passive mode (this one kills a lot of connections)
This sounds boring, but it’s often the real problem.
- In your FTP client settings, find Transfer mode or Connection mode
- Switch to Passive (PASV) mode if it is on Active
- Try again
With NAT routers, VPNs, hotel Wi‑Fi, etc., Active mode is pretty much a nightmare. Passive mode fixes random timeouts and listing errors for many people.
I actually disagree slightly with the idea that the client is the main thing to worry about; nine times out of ten it’s this passive/active setting plus a firewall.
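In script form, passive mode is literally a one-line switch. With Python's `ftplib`, for example:

```python
import ftplib

ftp = ftplib.FTP()     # no host given, so nothing connects yet
ftp.set_pasv(True)     # PASV: the client opens the data connection (NAT-friendly)
# ftp.connect("ftp.example.com", 21)  # placeholder host, not run here
# ftp.login("user", "secret")
```

GUI clients bury the same toggle somewhere under Transfer/Connection settings.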
3. Incomplete uploads & corrupt files
If files upload but end up truncated or broken:
- Check transfer type (ASCII vs Binary)
  - Code/text: ASCII is fine (it only rewrites line endings)
  - Anything else (images, zips, PDFs, fonts, etc.): Binary
Most modern clients use “Auto” which is usually smart enough, but if you keep getting corrupted images or zips, force Binary.
- Limit simultaneous connections / transfers
  - In settings, look for “Max simultaneous transfers” or “Concurrent connections”
  - Set it to 2 or 3, not 10
Some cheap hosting dies when you hammer it with lots of parallel uploads and silently drops files.
- Enable resume & verify
  - Many clients have “Resume interrupted transfers” and sometimes a checksum/compare option
  - Use that if you’re on a flaky network
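The “limit simultaneous transfers” advice looks like this when scripted: cap the worker pool instead of firing everything at once. A sketch with a stub standing in for the real per-file upload:

```python
from concurrent.futures import ThreadPoolExecutor

def upload_stub(name: str) -> str:
    """Stand-in for a real per-file upload; returns the name when 'done'."""
    return name

files = ["index.html", "style.css", "logo.png", "app.js"]

# 2-3 workers is gentle on shared hosting; 10+ parallel connections is how
# you get silently throttled or dropped.
with ThreadPoolExecutor(max_workers=2) as pool:
    done = list(pool.map(upload_stub, files))
```

The same idea applies in a GUI client: the “max simultaneous transfers” setting is just this worker cap.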
4. Ports & security settings
For quick testing, try:
- SFTP, port 22
- If that fails, FTPS (explicit), port 21
- Only use plain FTP if the host says that’s all they offer
If you keep getting TLS or certificate errors with FTPS:
- Make sure your system clock is correct
- Accept the cert once if it’s your host’s standard cert
- If it keeps changing every connect, that’s sketchy
5. Folder structure problems disguised as “FTP issues”
Sometimes “it doesn’t work” is really “I uploaded to the wrong path.”
Check:
- Are you inside `public_html`, `www`, `httpdocs`, or whatever your host says is the web root?
- Does `index.html` or `index.php` actually sit in that root, not one level too deep like `public_html/myproject/public_html`?
A quick test: upload a file called `test.txt` into what you think is the root, then try `https://yourdomain.com/test.txt`. If you get a 404, you’re in the wrong folder.
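That check can be scripted too. This sketch uses Python's stdlib `urllib` to report the HTTP status for a URL; the domain in the comment is a placeholder for yours.

```python
import urllib.request
import urllib.error

def check_url(url: str):
    """Return the HTTP status for url, or None if it cannot be fetched at all."""
    try:
        with urllib.request.urlopen(url) as resp:
            return getattr(resp, "status", None) or 200  # file:// URLs report no status
    except urllib.error.HTTPError as err:
        return err.code          # e.g. 404 -> you uploaded to the wrong folder
    except (urllib.error.URLError, OSError):
        return None              # DNS failure, refused connection, etc.

# check_url("https://yourdomain.com/test.txt")  # placeholder domain
```

200 means you found the web root; 404 means wrong folder; None means the problem is upstream of FTP entirely (DNS, server down).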
6. Timeouts & random disconnects
If you keep getting “Connection timed out” during longer uploads:
- In the client, increase timeout to 60–120 seconds
- Disable “timeout on idle” or keep-alive junk if it’s too aggressive
- For big folders, consider zipping them first, upload the zip, then unzip on the server (via SSH or your host’s file manager)
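Zipping before upload is easy to script as well; one archive means one data connection instead of thousands. A stdlib sketch, with placeholder paths:

```python
import shutil

def pack_site(src_dir: str, archive_base: str) -> str:
    """Zip src_dir into archive_base.zip and return the archive path."""
    return shutil.make_archive(archive_base, "zip", root_dir=src_dir)

# pack_site("my-site", "my-site-upload")  # creates my-site-upload.zip (placeholders)
```

Upload the resulting zip, then unzip it server-side via SSH or your host's file manager.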
7. Client choice (briefly)
I’m not as picky as @mikeappsreviewer, but I do agree a decent client saves headaches. If you’re on macOS and want something that behaves like a dual-pane file manager with SFTP built-in, Commander One is actually solid. Nice part is you see local on one side, server on the other, so you’re less likely to drag into the wrong folder and it handles queues and resume pretty well.
8. Quick checklist you can run through
- Confirm protocol & port from host (FTP/FTPS/SFTP, 21 vs 22)
- Set Passive mode
- Use Binary transfer or Auto
- Limit concurrent transfers to 2–3
- Increase timeout to 60–120s
- Make sure you’re in the correct web root folder
- Try from a different network to rule out firewall
If you post the exact error message your client shows (minus passwords, obviously), plus which protocol/port you’re using, people can usually point to the exact mis-setting in one reply.
You’re not crazy, FTP really is this annoying sometimes.
Since @mikeappsreviewer already walked through the “how to upload” basics and @espritlibre hit the usual suspects (SFTP, passive mode, binary, etc.), I’ll focus on how to actually debug what’s going wrong instead of just flipping random settings and praying.
1. Turn on detailed logs in your FTP client
Most people skip this and it’s where the real answers are.
- In your client, enable verbose / debug / detailed log.
- Try to connect and start an upload.
- Look for the first actual error code, like:
- `530 Login incorrect`
- `421 Too many connections`
- `425 Can't open data connection`
- `550 Permission denied`
- `226 Transfer complete` but the file is smaller than it should be
Post that exact line if you want help; everything else is just guesswork.
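Digging that first error out of a long log can even be automated: FTP reply codes starting with 4 or 5 are failures. A hypothetical helper over plain-text log lines (the sample log below is made up from the codes listed above):

```python
import re

# FTP replies start with a 3-digit code; 4xx and 5xx indicate failures.
ERROR_LINE = re.compile(r"^\s*([45]\d\d)\s+(.*)")

def first_error(log_lines):
    """Return (code, message) for the first 4xx/5xx reply in a log, or None."""
    for line in log_lines:
        m = ERROR_LINE.match(line)
        if m:
            return m.group(1), m.group(2)
    return None

log = [
    "220 Service ready",
    "331 Password required",
    "530 Login incorrect",
    "421 Too many connections",
]
```

Everything after the first error in a session is usually just fallout from it, so that one line is the one worth posting.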
2. Rule out your network first
Stuff @espritlibre said about firewalls is spot on, but I’d actually start here before messing with transfer types:
- Try from a different network (phone hotspot, neighbor’s Wi‑Fi).
- If it suddenly works:
- your ISP, router, or office firewall is blocking ports or FTP data connections.
- If possible, temporarily disable:
- VPN
- “Secure DNS” / parental controls / weird “internet security” suites
This alone can explain connection timeouts and directory listing failures.
3. Check server-side limits (the thing no client can fix)
Incomplete uploads are often the host silently killing the connection:
- Shared hosting can have:
- Max file size limits
- Process / connection limits
- Auto-kill if too many concurrent connections
- Symptoms:
- Large files stop at the same size every time.
- Upload silently ends with no clear error.
If that happens:
- Try a tiny test file, like a 1 KB txt file.
- Then try a medium file (5–10 MB).
- Then a big one (100+ MB, if relevant).
If only the big ones fail, that’s not your client, that’s the server or PHP/Apache limits. You’d need to check your host’s docs or open a ticket.
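Generating that test ladder takes one line per size. A sketch that creates a file of an exact byte count (sizes must be at least 1 byte); the filenames are placeholders:

```python
import os

def make_test_file(path: str, size_bytes: int) -> None:
    """Create a file of exactly size_bytes by seeking and writing one byte."""
    with open(path, "wb") as f:
        f.seek(size_bytes - 1)
        f.write(b"\0")

# make_test_file("tiny.txt", 1024)          # 1 KB
# make_test_file("medium.bin", 5 * 2**20)   # 5 MB
# make_test_file("big.bin", 100 * 2**20)    # 100 MB
```

Upload each in turn and note exactly which size first fails; that number is what you quote to your host.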
4. Narrow it down with a simple test folder
Instead of throwing your entire website at the server and hoping for the best:
- Create a folder on your machine called `ftp-test`.
- Put in:
  - `index.html` (simple “hello world” text)
  - one jpg image
  - one zip file
- Upload that folder only.
- Verify:
  - Does `index.html` load via browser?
  - Is the jpg viewable and not corrupted?
  - Does the zip extract properly on your machine after downloading it back from the server?
If the simple set works but the full site fails:
- You’re hitting size/timeout/connection count issues.
- At that point, limit simultaneous transfers to 1–2 and upload in chunks.
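“Download it back and compare” is best done with checksums, not eyeballs. A stdlib sketch comparing the local original against the re-downloaded copy:

```python
import hashlib

def sha256_of(path: str) -> str:
    """SHA-256 of a file, read in chunks so large files don't eat memory."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(65536), b""):
            h.update(chunk)
    return h.hexdigest()

def same_file(original: str, redownloaded: str) -> bool:
    """True if both files hash identically, i.e. the round trip was lossless."""
    return sha256_of(original) == sha256_of(redownloaded)
```

If the hashes differ, the upload was truncated or ran through ASCII mode; if they match, the FTP leg is fine and the problem is elsewhere (cache, wrong folder).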
5. Double check you are not fighting your own cache
Weird one, but it bites a ton of people:
- You upload a new file.
- It “doesn’t change” in the browser.
- You reupload five times, think FTP is broken.
Things to clear:
- Browser cache (or open site in incognito).
- Any CDN / caching layer your host gives you.
- If you use Cloudflare or similar, purge cache there too.
FTP did its job, the cached version is lying to you.
6. A slightly different tool approach
Since you’re struggling a bit with the client / settings, I’d honestly simplify your setup while debugging:
- On macOS, Commander One is actually a decent option for this:
- Dual pane: left is your local files, right is the server.
- Connect with SFTP if your host allows it.
- Its transfer queue and error reporting are pretty clear, so you can actually see which files fail instead of a vague “something broke.”
I don’t totally agree with the idea that “any client is fine.” Some of the old-school ones hide important options or logs behind terrible menus, which just wastes your time. Using something that behaves like a normal file manager while you troubleshoot can make this less painful.
7. Concrete checklist to run in this order
Try this sequence, top to bottom:
- Enable detailed logging in your FTP client.
- Switch to SFTP on port 22 if your host supports it.
- Set Passive mode (if you stick with FTP/FTPS).
- Set transfer mode to Binary or Auto.
- Limit concurrent transfers to 1 or 2.
- Test from a different network (hotspot).
- Upload the small `ftp-test` folder instead of your whole site.
- Verify files by downloading them back and comparing sizes.
If you share:
- protocol you’re using (FTP / FTPS / SFTP),
- port,
- and the exact first error line from your log,
it’s usually possible to point out the exact misconfig in one reply instead of just “try passive mode?” again and again.
If connection errors and half‑uploaded files keep happening even after what @espritlibre, @hoshikuzu and @mikeappsreviewer suggested, focus less on “which button” and more on how your client behaves under bad conditions.
A few angles that do not just repeat what they already covered:
- Treat FTP as unreliable by default
  Assume connections will drop and configure your client to survive that.
  - Reduce simultaneous transfers to 1–2, especially on cheap shared hosts.
  - Enable automatic retries with a short delay instead of hammering the server.
  - Turn on “resume broken transfers” and test it with a large file.
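The “automatic retries with a short delay” idea looks like this when scripted; `do_upload` is a stand-in for whatever actually moves the file:

```python
import time

def with_retries(do_upload, attempts: int = 3, delay: float = 2.0):
    """Call do_upload(), retrying up to `attempts` times with a pause between tries."""
    last_err = None
    for i in range(attempts):
        try:
            return do_upload()
        except OSError as err:          # dropped connections surface as OSError
            last_err = err
            if i < attempts - 1:
                time.sleep(delay)       # short pause, don't hammer the server
    raise last_err
```

A GUI client's “retry failed transfers” option is doing essentially this loop per file.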
- Use SFTP as the default, classic FTP only as a fallback
  I actually disagree slightly with relying on plain FTP at all. If SFTP is available, use it first: you avoid a lot of passive/active port nonsense because it runs over a single SSH connection. Only drop back to FTP/FTPS if your host explicitly requires it.
- Binary vs ASCII is a data integrity issue, not just a ‘setting’
  People often gloss over this. If images or zips are corrupt after upload, it is almost always the wrong transfer type. Force Binary for everything. “Auto” works most of the time but I have seen hosts misdetect.
- Watch the directory depth and file count
  Some hosts choke on uploads with thousands of tiny files (big JS frameworks, node_modules, etc.). Symptoms look like random “timeout” or “failed to list directory.”
  - Zip your project locally.
  - Upload a single zip.
  - Unzip on the server (via SSH or the host’s file manager).
  If that works, the problem is connection overhead, not your settings.
- Client choice actually matters here
  This is where I slightly part ways with the “any client is fine” idea. Different clients handle flaky connections differently.
  - A dual‑pane file manager such as Commander One helps because you see local and remote side by side and it exposes the transfer queue clearly.
  - Pros of Commander One:
    - Feels like normal file management, not a weird separate tool.
    - SFTP, FTP and others in one place.
    - Good for dragging whole folders and watching what fails.
  - Cons of Commander One:
    - macOS only, so not helpful if you are on Windows or Linux.
    - Some advanced sync/compare features are not obvious for beginners.
Compared to what @mikeappsreviewer described, the real value in a client like that is not just “drag & drop” but also seeing per‑file errors quickly, which is what you need when uploads are incomplete.
- Isolate where the failure happens
  Instead of guessing:
  - Upload a single medium‑sized file (say 20–50 MB).
- Note the size locally vs on the server.
- If it stops at the same size every time, that is likely a host/network limit.
- If it stops at random sizes, that is more often a flaky line, Wi‑Fi issue, or idle timeout on the server.
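Once you've noted the byte counts where a few failed uploads stopped, that same-size-vs-random-size distinction can be checked mechanically. A hypothetical helper; the wording of the verdicts is mine:

```python
def diagnose_truncation(observed_sizes):
    """Guess the failure type from the byte counts where uploads stopped."""
    if len(set(observed_sizes)) == 1:
        return "consistent limit: likely a host/network size or time cap"
    return "varying sizes: likely a flaky connection or idle timeout"
```

The point is just to force yourself to record the numbers; the pattern in them is the diagnosis.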
- Use different tools to cross check
  If Commander One or your current GUI client is problematic, try a simple command‑line SFTP from the same machine:
  - If CLI succeeds consistently but the GUI client fails, the issue is client config.
- If both fail in the same way, it is host, network, or credentials.
- Do not ignore server‑side logs
  Everyone stares at client logs but the server logs often say “connection closed by timeout” or “too many connections from this IP.” If your host gives you access to FTP or SSH logs, scan them around the time of your failed upload.
If you post the exact protocol (FTP / FTPS / SFTP), the port, plus the first real error message from your client’s log, you can usually pinpoint one specific cause instead of chasing every possible setting.