I am reading an image file and converting it into its binary (bit) representation. Then I am converting that binary into decimal. According to my algorithm, I want to process 50,000 bits at a time; I explain the algorithm below, and a sketch of the idea follows.
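For reference, here is a minimal Python sketch of what I mean, assuming "binary format" means the raw bytes of the file expanded into a bit string, and that each 50,000-bit chunk is interpreted as one big decimal integer. The file name `image.png` and the function names are just illustrative:

```python
CHUNK_BITS = 50_000  # process 50,000 bits at a time

def bits_from_file(path):
    """Read a file and return its contents as a string of '0'/'1' bits."""
    with open(path, "rb") as f:
        data = f.read()
    # Each byte becomes 8 bits, zero-padded on the left.
    return "".join(format(byte, "08b") for byte in data)

def decimal_chunks(bits, chunk_bits=CHUNK_BITS):
    """Split the bit string into chunk_bits-sized pieces and
    convert each piece to an arbitrary-precision decimal integer."""
    for i in range(0, len(bits), chunk_bits):
        chunk = bits[i:i + chunk_bits]
        yield int(chunk, 2)  # parse the base-2 string as an int

# Example usage (hypothetical file name):
# for n in decimal_chunks(bits_from_file("image.png")):
#     print(n)  # each n is the decimal value of one 50,000-bit block
```

Note that the last chunk may be shorter than 50,000 bits if the file's bit length is not an exact multiple of the chunk size.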
Now the problem is:
Here are two demos:
Thanks