ERROR 1406 (22001): Data Too Long for Column in MySQL

Error Message

ERROR 1406 (22001): Data too long for column 'column_name' at row 1

What Triggers This Error

MySQL 1406 fires when an INSERT or UPDATE tries to store a value that exceeds the target column's declared length, and sql_mode includes STRICT_TRANS_TABLES or STRICT_ALL_TABLES. Without strict mode, MySQL silently truncates the value and issues a warning instead — which is why this error often appears suddenly after a server upgrade or sql_mode change. The fix depends on why the value is too large:

  • Value exceeds the declared VARCHAR(N) length — a 50-character string for a VARCHAR(20) column
  • Multi-byte characters inflate byte storage — an emoji in a utf8mb4 column takes 4 bytes but still counts as one character against VARCHAR(N)
  • Data migration from a larger to a smaller column type — moving data from TEXT to VARCHAR(255) without auditing
  • sql_mode switched from non-strict to strict — values that used to truncate silently now raise 1406
  • Bulk import exceeds column limit — LOAD DATA INFILE stops at the first oversized row

Fix by Scenario

Value exceeds declared VARCHAR(N) length

The most common case. Someone inserts a long string into a column that was sized for shorter values.

CREATE TABLE users (
  username VARCHAR(20) NOT NULL
);

INSERT INTO users (username) VALUES ('a_very_long_username_that_exceeds_20_chars');
-- ERROR 1406 (22001): Data too long for column 'username' at row 1

Fix:

  1. Check the actual length of the offending value (use CHAR_LENGTH for characters; LENGTH counts bytes):
SELECT CHAR_LENGTH('a_very_long_username_that_exceeds_20_chars');
-- 42
  2. Decide whether to expand the column or truncate the input:
-- Option A: Expand the column to accommodate expected values
ALTER TABLE users MODIFY COLUMN username VARCHAR(100);

-- Option B: Truncate in application code before insert
-- Python example
username = user_input[:20]
cursor.execute("INSERT INTO users (username) VALUES (%s)", (username,))

Expanding a VARCHAR is usually fast: MySQL can often do it in place, but only while the column's length-prefix size stays the same — if the maximum byte length crosses 255 bytes (easy to reach with utf8mb4), the ALTER falls back to a table rebuild. Prefer expanding when the column's business meaning allows longer values; truncating loses data.

Multi-byte charset inflates storage

A VARCHAR(10) column stores 10 characters, not 10 bytes, regardless of charset. With utf8mb4, each character can take up to 4 bytes, which matters for index and row-size byte limits. Migrating from utf8 (3 bytes max, no emoji) to utf8mb4 also newly permits 4-byte characters such as emoji; inputs the old charset rejected outright can now trip character-length limits that were sized for plain text.

CREATE TABLE messages (
  content VARCHAR(10) CHARACTER SET utf8mb4
);

INSERT INTO messages (content) VALUES ('😀😀😀😀😀😀😀😀😀😀😀');
-- ERROR 1406 (22001): Data too long for column 'content' at row 1
-- 11 characters (each emoji is one character), but the column holds 10
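
The character-versus-byte distinction behind this example is easy to verify outside MySQL. A minimal Python check (plain Python string handling, not a MySQL client):

```python
# One emoji is a single character but four bytes in UTF-8,
# the encoding behind MySQL's utf8mb4.
content = "😀" * 11

print(len(content))                  # 11 characters -> too long for VARCHAR(10)
print(len(content.encode("utf-8")))  # 44 bytes -> what counts toward byte limits
```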

Fix: Count characters, not bytes. VARCHAR(N) limits characters, so the INSERT above fails because it supplies 11 of them. Bytes matter separately: a utf8mb4 VARCHAR reserves up to 4 bytes per character against MySQL's 65,535-byte row limit. For short columns that must hold emoji, expand the column to accommodate:

ALTER TABLE messages MODIFY COLUMN content VARCHAR(100) CHARACTER SET utf8mb4;

For high-volume tables, consider whether 4-byte characters are actually needed in that column. If the column only stores ASCII (e.g., slugs, identifiers), keep it as utf8mb4 but validate input:

-- Add a generated column that flags non-ASCII input
-- (check the flag in application code or a trigger; it does not reject inserts by itself)
ALTER TABLE users
  ADD COLUMN username_ascii CHAR(1) GENERATED ALWAYS AS (
    IF(username REGEXP '^[[:ascii:]]+$', 'y', 'n')
  ) VIRTUAL;

-- Or enforce via CHECK constraint (MySQL 8.0.16+)
ALTER TABLE users
  ADD CONSTRAINT username_ascii_only CHECK (username REGEXP '^[[:ascii:]]+$');

Data migration from a larger to a smaller column type

During a schema refactor, someone might shrink a TEXT column to VARCHAR(255) or consolidate columns. If existing rows contain values longer than the new limit, the ALTER TABLE itself can fail with 1406 on one of the rows.

-- Suppose bio is currently TEXT, with some rows containing 2,000+ characters
ALTER TABLE users MODIFY COLUMN bio VARCHAR(255);
-- ERROR 1406 (22001): Data too long for column 'bio' at row 1

Fix: Audit before shrinking. Find rows that would fail:

SELECT id, CHAR_LENGTH(bio) AS bio_length
FROM users
WHERE CHAR_LENGTH(bio) > 255
ORDER BY bio_length DESC
LIMIT 20;

Then decide:

  1. Truncate the offenders before the ALTER:
UPDATE users SET bio = LEFT(bio, 255) WHERE CHAR_LENGTH(bio) > 255;
ALTER TABLE users MODIFY COLUMN bio VARCHAR(255);
  2. Or split into two columns (short summary + optional full text):
ALTER TABLE users
  ADD COLUMN bio_summary VARCHAR(255) GENERATED ALWAYS AS (LEFT(bio, 255)) STORED;

Never run a shrinking ALTER on a large table without first verifying row lengths — it can run for hours and then fail mid-operation.
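
The audit logic itself can also be scripted. A minimal Python sketch, assuming rows are fetched as (id, bio) tuples through any MySQL client library:

```python
def oversized_rows(rows, limit):
    """Return (id, char_length) pairs for values that would fail a shrink to VARCHAR(limit)."""
    return [(row_id, len(value)) for row_id, value in rows
            if value is not None and len(value) > limit]

# Simulated result set standing in for SELECT id, bio FROM users
sample = [(1, "x" * 300), (2, "fits"), (3, "y" * 256)]
print(oversized_rows(sample, 255))  # [(1, 300), (3, 256)]
```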

sql_mode switched from non-strict to strict

MySQL 5.7+ defaults to strict mode. If you inherited a database from MySQL 5.6 or earlier (or a server with non-default sql_mode), the application may have been silently relying on truncation.

-- Check current sql_mode
SELECT @@sql_mode;
-- STRICT_TRANS_TABLES,ERROR_FOR_DIVISION_BY_ZERO,NO_ENGINE_SUBSTITUTION

-- Previously was:
-- NO_ENGINE_SUBSTITUTION  (non-strict)

Fix: Do NOT disable strict mode as a fix — strict mode catches real data integrity bugs. Instead, find and fix the oversized inputs:

-- Temporarily disable strict mode for the current session to diagnose
-- (use ONLY for investigation, not production; the REPLACE below assumes the
-- strict flag is followed by a comma, so drop the trailing comma if it is the last flag)
SET SESSION sql_mode = REPLACE(REPLACE(@@sql_mode, 'STRICT_TRANS_TABLES,', ''), 'STRICT_ALL_TABLES,', '');

-- Run the failing inserts
INSERT INTO users (username) VALUES ('...');
SHOW WARNINGS;
-- +---------+------+-----------------------------------------------+
-- | Level   | Code | Message                                       |
-- +---------+------+-----------------------------------------------+
-- | Warning | 1265 | Data truncated for column 'username' at row 1 |
-- +---------+------+-----------------------------------------------+

-- Re-enable strict mode and fix the actual input
SET SESSION sql_mode = DEFAULT;

The long-term fix is input validation at the application layer plus column types that match real data distributions.
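
As a sketch of that application-layer validation (the column limits here are hypothetical; real code should read them from information_schema.COLUMNS rather than hardcode them):

```python
# Hypothetical limits mirroring the users table in this article.
COLUMN_LIMITS = {"username": 20, "bio": 255}

def validate_lengths(record):
    """Raise a user-facing error instead of letting MySQL raise 1406."""
    for field, limit in COLUMN_LIMITS.items():
        value = record.get(field) or ""
        if len(value) > limit:
            raise ValueError(
                f"{field} is {len(value)} characters; the maximum is {limit}"
            )

validate_lengths({"username": "alice", "bio": "short bio"})  # passes silently
```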

Bulk import with one oversized row

LOAD DATA INFILE and mysqldump restores fail immediately on the first 1406, leaving the import partially complete.

mysql -u root -p mydb < dump.sql
# ERROR 1406 (22001) at line 4283: Data too long for column 'description' at row 1

Fix: Find the offending row in the dump file before re-running:

# Rough heuristic: split the dump on ';' and flag any INSERT mentioning the description
# column whose statement text is unusually long (statement length, not field length).
# Use this to locate suspect rows, then inspect them to confirm the offending value.
awk -v RS=';' '/INSERT INTO.*description/ && length > 10000' dump.sql | head

Or import into a wider staging table first, then migrate:

-- Staging table with TEXT column
CREATE TABLE products_staging LIKE products;
ALTER TABLE products_staging MODIFY COLUMN description TEXT;

-- Import into staging
LOAD DATA INFILE '/tmp/products.csv' INTO TABLE products_staging;

-- Migrate with truncation (or audit first)
INSERT INTO products (id, name, description)
SELECT id, name, LEFT(description, 500) FROM products_staging;

For production imports, always pre-validate data against the target schema — LOAD DATA failures halfway through are painful to reconcile.
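
One way to pre-validate a CSV before LOAD DATA INFILE, sketched in Python (the LIMITS mapping is an assumption; populate it from your target schema):

```python
import csv
import io

LIMITS = {"name": 100, "description": 500}  # assumed target column sizes

def invalid_csv_rows(fileobj):
    """Yield (line_number, column_name) for fields that would trip error 1406."""
    reader = csv.DictReader(fileobj)
    for lineno, row in enumerate(reader, start=2):  # line 1 is the header
        for column, limit in LIMITS.items():
            if len(row.get(column) or "") > limit:
                yield lineno, column

data = "id,name,description\n1,Widget,fits\n2,Gadget," + "x" * 600 + "\n"
print(list(invalid_csv_rows(io.StringIO(data))))  # [(3, 'description')]
```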

Prevention

  • Keep sql_mode strict — truncating strings silently hides real bugs
  • Size VARCHAR columns to match actual data distribution, not guessed maximums (run SELECT MAX(CHAR_LENGTH(col)) FROM table before picking a size)
  • For utf8mb4 columns, remember that 1 character can take up to 4 bytes when budgeting row length against MySQL's 65,535-byte row limit
  • Validate input length in application code before sending to the database — return a useful error to the user instead of bubbling up 1406
  • Before any ALTER TABLE ... MODIFY COLUMN that shrinks a column, run a SELECT MAX(CHAR_LENGTH(col)) audit
  • For bulk imports, load into a staging table with relaxed types, then validate and migrate
  • Use generated columns or CHECK constraints to enforce domain-specific length rules at the database level

Bytebase's SQL Review can flag column-shrinking ALTER TABLE statements and missing length constraints during change review, catching 1406-prone changes before they reach production. See also ERROR 1366: Incorrect String Value for a related charset-driven variant of the same class of error.
