XOR-ing a large file in Python


I am trying to apply an XOR operation to a number of files, some of which are quite large.
Basically I am reading a file and XOR-ing it byte by byte (or at least that's what I think I'm doing). When it hits a larger file (around 70 MB) I get an out of memory error and my script crashes.
My computer has 16 GB of RAM with more than 50% of it available, so I don't think this is related to the hardware.

    def xor3(source_file, target_file):
        b = bytearray(open(source_file, 'rb').read())
        for i in range(len(b)):
            b[i] ^= 0x71
        open(target_file, 'wb').write(b)

I tried to read the file in chunks, but it seems I'm too inexperienced, because the output is not the desired one. The first function returns what I want, of course :)

    def xor(data):
        b = bytearray(data)
        for i in range(len(b)):
            b[i] ^= 0x41
        return data

    def xor4(source_file, target_file):
        with open(source_file, 'rb') as ifile:
            with open(target_file, 'w+b') as ofile:
                data = ifile.read(1024*1024)
                while data:
                    ofile.write(xor(data))
                    data = ifile.read(1024*1024)


What is the appropriate solution for this kind of operation? What am I doing wrong?

Read the file in chunks and append each processed chunk to the output file. Note that an explicit seek is not needed: read() already advances the file position by the number of bytes read.

    chunk_size = 1000  # for example

    with open(source_file, 'rb') as source:
        with open(target_file, 'ab') as target:  # 'ab': append in binary mode
            chunk = source.read(chunk_size)
            while chunk:
                data = bytearray(chunk)
                for i in range(len(data)):
                    data[i] ^= 0x71
                target.write(data)
                chunk = source.read(chunk_size)
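As a faster sketch of the same chunked approach (function name and default key are illustrative, not from the question), the per-byte Python loop can be replaced with `bytes.translate` and a precomputed 256-entry table, so each chunk is transformed in C rather than in interpreted code:

```python
def xor_file(source_file, target_file, key=0x71, chunk_size=1024 * 1024):
    """XOR every byte of source_file with a single-byte key, writing to target_file."""
    # Table mapping each possible byte value b to b ^ key.
    table = bytes(b ^ key for b in range(256))
    with open(source_file, 'rb') as src, open(target_file, 'wb') as dst:
        while True:
            chunk = src.read(chunk_size)
            if not chunk:  # empty bytes means end of file
                break
            # translate() applies the table to the whole chunk at once.
            dst.write(chunk.translate(table))
```

Because only one chunk is held in memory at a time, this works for files of any size, and XOR-ing twice with the same key restores the original file.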
