
We store full HTML pages in a table, in a field called 'content'. The table is essentially an archive and we would rarely use it. The 'content' field is currently of type 'longtext', and I would like to know the best way to compress this table's data, as it is taking up many GBs of space.

I tried simply running the gzcompress function on the content field in PHP, but that did not work, e.g.:

 $content = gzcompress($content,9);

I also tried this, but it does not work either:

update posts_content set content = COMPRESS(content)

Any ideas on the best way to do this, and how?

Another question here, Is mysql automatically compressing the database?, talks about InnoDB, but we use MyISAM and could only move to InnoDB as a last resort.

Thank you.

  • Take a look at this question - stackoverflow.com/questions/8228950/… Commented Feb 2, 2012 at 14:03
  • If the field type is TEXT, then the compressed binary data is likely getting corrupted by character set conversions. Use a BLOB type to store raw binary data - they're not subject to charset conversions. Commented Feb 2, 2012 at 14:23
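The corruption described in the second comment is easy to reproduce. As a sketch, Python's zlib module emits the same zlib-format (RFC 1950) stream as PHP's gzcompress, and the output is plainly not text:

```python
import zlib

html = b"<html><body>" + b"archived page content " * 50 + b"</body></html>"
packed = zlib.compress(html, 9)  # same zlib-format stream as PHP's gzcompress($html, 9)

# The compressed stream contains bytes outside the ASCII range, so a
# TEXT/longtext column can mangle it during character-set conversion.
assert any(b > 0x7f for b in packed)

# The original data is only recoverable while the bytes survive untouched:
assert zlib.decompress(packed) == html
```

Storing the result in a BLOB column avoids the problem entirely, since BLOB types have no character set.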

1 Answer


When using compression, the result is binary, not text, so you shouldn't try to store it in a TEXT column. A BLOB column should work with the gzcompress approach.
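As a minimal sketch of the whole round-trip, assuming the column has first been converted to a binary type (e.g. `ALTER TABLE posts_content MODIFY content LONGBLOB`): compress on write, decompress on read. The Python zlib calls below stand in for PHP's gzcompress/gzuncompress, which produce and consume the same stream:

```python
import zlib

page = "<html><body>archived page</body></html>" * 100

# Write path: compress the HTML to binary before storing it in the BLOB column.
blob = zlib.compress(page.encode("utf-8"), 9)

# Read path: decompress the BLOB back into the original HTML.
restored = zlib.decompress(blob).decode("utf-8")

assert restored == page
print(f"{len(page.encode('utf-8'))} bytes -> {len(blob)} bytes")
```

For archived, repetitive HTML like this, the compressed size is a small fraction of the original, which is the space saving the question is after.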
