Sometimes I have text files containing simple one-line records that are used on more than one computer and so may get out of sync: for example, password files, user records or the like maintained on more than one machine.
file_a might contain:

fred sys1 A123
dick sys1 B333
harry sys2 C433

file_b might contain:

fred sys1 A123
dick sys1 B333
tom sys2 X543
The goal of the merge is to include all the lines from each file without duplicates, i.e.:
Desired result:

fred sys1 A123
dick sys1 B333
harry sys2 C433
tom sys2 X543
This is pretty easy in this simple case but there could be hundreds of records involved.
Luckily, this is a task we can do at the command line using grep. The trick is to treat one file as a collection of 'patterns' that we don't want to see in our output: -f file_b reads the patterns from file_b, -F treats them as fixed strings rather than regular expressions, -x requires each pattern to match a whole line, and -v inverts the match. Thus, to list all of the lines that are in file_a but not present in file_b:
~/wk $ grep -vxFf file_b file_a
harry sys2 C433
Using this, we can produce our desired result by appending the output to a copy of file_b:
~/wk $ cp file_b result
~/wk $ grep -vxFf file_b file_a >> result
~/wk $ cat result
fred sys1 A123
dick sys1 B333
tom sys2 X543
harry sys2 C433
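The steps above can be wrapped in a small function so the merge is repeatable. This is a minimal sketch; the function name merge_lines is my own, and it assumes a grep that supports the -v, -x, -F and -f flags together (GNU and BSD grep both do). Note that grep exits with status 1 when file_a contributes no new lines, which is harmless here.

```shell
# merge_lines FILE_A FILE_B: print the union of the two files' lines,
# keeping all of FILE_B followed by any FILE_A lines not already in FILE_B.
merge_lines() {
    cat "$2"
    grep -vxFf "$2" "$1"
}

# Demo with the sample records, written to a temporary directory:
dir=$(mktemp -d)
printf '%s\n' 'fred sys1 A123' 'dick sys1 B333' 'harry sys2 C433' > "$dir/file_a"
printf '%s\n' 'fred sys1 A123' 'dick sys1 B333' 'tom sys2 X543'  > "$dir/file_b"
merge_lines "$dir/file_a" "$dir/file_b"
rm -r "$dir"
```

Unlike a `sort -u` over both files, this approach preserves the original line order of file_b, which matters for files where record order is significant.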