Tech giant Samsung and chip design company Arm are teaming up for some research into parallel packet processing technology to accelerate 6G software development.

As part of the collaborative research, Samsung Research plans to launch an open-source project with Arm to jointly develop and refine parallel packet processing technology. This technology is described as one of the ‘key software technologies in next-generation communications.’

Parallel packet processing is useful because it can handle vast quantities of communication data, and so, we’re told, with the growth of data inherent in ‘next-generation communication’ – a term the release uses alongside 6G but which presumably means the same thing – it can contribute to the establishment of ‘flexible and efficient communication systems.’
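The release gives no implementation detail, but the general idea – processing independent packets concurrently across workers rather than one at a time – can be sketched as follows. All names and the toy workload here are illustrative assumptions, not anything from Samsung’s or Arm’s actual code:

```python
from concurrent.futures import ThreadPoolExecutor

def checksum(packet: bytes) -> int:
    # Toy per-packet work: a byte sum standing in for real protocol
    # processing (header parsing, filtering, forwarding decisions).
    return sum(packet) % 65536

def process_serial(packets):
    # Baseline: handle packets one at a time.
    return [checksum(p) for p in packets]

def process_parallel(packets, workers=4):
    # Fan packets out across a pool of workers. Each packet is
    # independent of the others, which is what makes the workload
    # parallelisable in the first place.
    with ThreadPoolExecutor(max_workers=workers) as pool:
        return list(pool.map(checksum, packets))

packets = [bytes([i % 256]) * 64 for i in range(1000)]
assert process_parallel(packets) == process_serial(packets)
```

Real packet-processing stacks would do this in lower-level, lock-free code pinned to CPU cores, but the principle – independent packets, parallel workers – is the same.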

The research project aims to ‘significantly accelerate the research and development timeline ahead of the surge of data driven by 6G environments.’

“In 6G communications, the importance of software technology is increasing. Innovation is crucial in handling the massive amounts of data that result from this,” said Jinguk Jeong, Executive Vice President at Samsung Research’s Advanced Communications Research Center (ACRC). “This technical partnership with Arm is a significant step towards revolutionizing parallel technology.”

Samsung’s ACRC facility is ‘dedicated to leading the charge in developing next-generation communication technologies’, and is involved in standardization, researching 6G technologies and software solutions such as parallel packet processing and AI, states the release.

Mohamed Awad, Senior Vice President and General Manager, Infrastructure Line of Business, Arm added: “AI is fuelling the demand for next-generation technologies like 6G, but the insatiable amount of data creates a vital need for power-efficient processing. We are leveraging our expertise in high-performance, low-power and flexible computing by collaborating with Samsung Research to accelerate the 6G software development and enable the AI infrastructure to run as efficiently as possible.”

So the project seems to be about developing software that can better handle large amounts of data. The release doesn’t go into the specifics of why exactly this is a 6G thing rather than a more general network optimisation research project, but then it’s hard to be specific about 6G since it hasn’t been created yet.

The numerically progressive naming system the telecoms industry has settled on for segmenting eras of network technology means 6G is inevitable at some point, unless the industry abandons the idea altogether. So talking about 6G and talking about the future needs of networks end up being somewhat synonymous. But this project purports to be about speeding up the time it takes to lay some of that groundwork.

And while 6G hasn’t been technologically defined, one thing that will inevitably coincide with its emergence is an increase in the amount of network data, since that curve is going up year on year anyway, fuelled more recently in part by AI.
