Hello Suman, as Kir pointed out, you can use a streaming Dataflow pipeline to archive every message published to your topic to BigQuery, Datastore, or another storage medium of your choice.
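For instance, a minimal archiving pipeline with the Apache Beam Python SDK might look something like the sketch below. The project, topic, and table names are placeholders, and the two-column schema is just one way you could store the payload:

```python
import apache_beam as beam
from apache_beam.options.pipeline_options import PipelineOptions

# Placeholder names -- substitute your own project, topic, and table.
TOPIC = "projects/my-project/topics/main-topic"
TABLE = "my-project:archive.messages"

def run():
    # streaming=True keeps the pipeline running indefinitely.
    options = PipelineOptions(streaming=True)

    with beam.Pipeline(options=options) as p:
        (p
         # Read raw message bytes from the main topic.
         | "ReadFromPubSub" >> beam.io.ReadFromPubSub(topic=TOPIC)
         # Convert each message into a BigQuery row, keeping the
         # element timestamp (the publish time) for later replay.
         | "ToRow" >> beam.Map(
               lambda data, ts=beam.DoFn.TimestampParam: {
                   "payload": data.decode("utf-8"),
                   "publish_time": ts.to_rfc3339(),
               })
         # Append every message to the archive table.
         | "WriteToBigQuery" >> beam.io.WriteToBigQuery(
               TABLE,
               schema="payload:STRING, publish_time:TIMESTAMP",
               write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND))

if __name__ == "__main__":
    run()
```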
As for the initial load of a new subscriber: start by creating a subscription on the main topic for the new subscriber, so it begins receiving new messages as soon as they are published. You can then backfill the historical data however you like. The simplest option is probably to have the new subscriber read your storage medium directly and import the archive of old messages, ignoring any duplicates it already received through the subscription. Alternatively, you could create a topic used exclusively for priming new subscribers, republish the archived messages to it (with another Dataflow pipeline or a small script, as sketched below), and let the new subscriber consume its initial load as ordinary Pub/Sub messages.
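For the dedicated backfill topic approach, here is a minimal sketch using the google-cloud-bigquery and google-cloud-pubsub client libraries. Again the names are placeholders, and it assumes the archive table has the payload and publish_time columns from the pipeline above:

```python
from google.cloud import bigquery, pubsub_v1

# Placeholder names -- replace with your own project, table, and topic.
PROJECT = "my-project"
ARCHIVE_TABLE = "my-project.archive.messages"
BACKFILL_TOPIC = "backfill-topic"

def backfill():
    bq = bigquery.Client(project=PROJECT)
    publisher = pubsub_v1.PublisherClient()
    topic_path = publisher.topic_path(PROJECT, BACKFILL_TOPIC)

    # Read the archived messages in publish order and republish each
    # payload to the backfill topic for the new subscriber.
    query = f"SELECT payload FROM `{ARCHIVE_TABLE}` ORDER BY publish_time"
    for row in bq.query(query).result():
        future = publisher.publish(topic_path, row["payload"].encode("utf-8"))
        future.result()  # block until the message is accepted

if __name__ == "__main__":
    backfill()
```

Note that publish order is not a delivery guarantee, so the new subscriber should still tolerate out-of-order and duplicate messages during the backfill.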
I hope this helps.