Importing from LastPass, I encountered the 10k-character maximum limit. I found the offending records, removed them, and successfully imported the rest.
Here comes the fun part: I then split the offending records into chunks of <= 9,000 characters and attempted to create new records for them individually, but I’m still getting the 10k error when pasting some of them in. I can plainly see each has well under 10k characters (wc -m), yet Bitwarden tells me it’s over 10k. Am I missing something?
Out of interest, I created a secure note by repeating the characters 1234567890 until I got the error about “the field note exceeds the maximum encrypted value length of 10,000 characters”.
The maximum BW note size is 7,439 characters (using the method I described).
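For what it’s worth, that 7,439 figure lines up with the overhead you’d expect if the limit applies to Bitwarden’s cipher string rather than the plaintext. Here is a back-of-the-envelope sketch, assuming AES-256-CBC with PKCS#7 padding, a 16-byte IV, and a 32-byte HMAC-SHA256 tag, each Base64-encoded into a “2.<iv>|<ciphertext>|<mac>” string (an assumption about the storage format):

```python
import math

def encrypted_length(plaintext_bytes):
    """Estimated length of a "2.<iv>|<ciphertext>|<mac>" cipher string,
    assuming AES-256-CBC with PKCS#7 padding, a 16-byte IV, and a
    32-byte HMAC-SHA256 tag, each Base64-encoded."""
    b64 = lambda n: 4 * math.ceil(n / 3)       # Base64 output length for n bytes
    padded = (plaintext_bytes // 16 + 1) * 16  # PKCS#7 always adds 1-16 bytes
    return len("2.") + b64(16) + 1 + b64(padded) + 1 + b64(32)

# Largest ASCII plaintext whose cipher string stays within 10,000 characters:
n = 0
while encrypted_length(n + 1) <= 10_000:
    n += 1
print(n)  # 7439 -- matches the limit found by trial and error above
```

Under those assumptions the cap is on the encrypted value, not the plaintext, which would also explain why notes that look fine under wc -m still get rejected.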
I then encrypted the same message with GPG/PGP (which I know compresses before encrypting), and the result is just 227 ASCII characters.
That is 227 characters versus 10,000, both AES-256. Maybe BW should compress before encrypting, to help alleviate the restriction on note size.
password is password LOL
-----BEGIN PGP MESSAGE-----
This is not a bad idea, but your example is misleading:
Because of the way compression algorithms work, repeated patterns compress much more efficiently than other data (for example, I can create a “compressed” representation of the string 1234567890 repeated 750 times, simply by encoding this pattern as 750X1234567890, thus reducing the data size from 7500 bytes to 14 bytes).
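To make that concrete, here is a toy sketch of the idea (the encode_repeats helper and the <count>X<pattern> format are purely illustrative, not a real compression scheme):

```python
def encode_repeats(text, pattern):
    """Toy run-length encoder: if text is exactly pattern repeated some
    whole number of times, store it as "<count>X<pattern>"; otherwise
    return the text unchanged."""
    count, remainder = divmod(len(text), len(pattern))
    if remainder == 0 and text == pattern * count:
        return f"{count}X{pattern}"
    return text

print(encode_repeats("1234567890" * 750, "1234567890"))  # 750X1234567890 (14 chars)
```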
Below is a more realistic experiment, using an online text compression tool. In each case, the input text consisted of 7500 characters:
| Text Source | Compressed Size | Compression |
|---|---|---|
| Repeated “1234567890” | 68 bytes | 99% |
| Lorem Ipsum pseudo-Latin | 1,896 bytes | 75% |
| Moby Dick Chapter 1 | 4,824 bytes | 36% |
| Random ASCII characters | 7,724 bytes | −3% |
The Moby Dick example suggests that for English text, it may be possible to store a Secure Note that is up to roughly 12k characters in length (7,500 characters compressing to 4,824 bytes means English text shrinks to about 64% of its original size, so ~7.4k of capacity could hold ~11.6k of plaintext). However, for storing encryption keys and other random data, you would be better off not using any compression, since the compression algorithm actually expanded the data size.
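The two extremes of the table are easy to sanity-check locally with Python’s zlib (a rough sketch; exact byte counts will differ from the online tool depending on algorithm and compression level, and random bytes stand in here for the random-ASCII row):

```python
import os
import zlib

def compressed_size(data):
    """Size in bytes after zlib compression at the maximum level."""
    return len(zlib.compress(data, 9))

repeated = b"1234567890" * 750   # 7,500 bytes of pure pattern
random_data = os.urandom(7500)   # 7,500 incompressible random bytes

print(compressed_size(repeated))     # a few dozen bytes (~99% saved)
print(compressed_size(random_data))  # slightly over 7,500: zlib adds framing
                                     # overhead it cannot recover on random input
```

Heavily patterned data all but disappears, while high-entropy data gains nothing and even pays a small overhead, matching the first and last rows of the table.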