Performance Analysis of Optimization Methods on Hyperbolic Embeddings | HackerNoon

News Room · Published 14 January 2026, last updated 3:57 PM

Table of Links

Abstract and 1. Introduction

2. Related Works

3. Convex Relaxation Techniques for Hyperbolic SVMs

    3.1 Preliminaries

    3.2 Original Formulation of the HSVM

    3.3 Semidefinite Formulation

    3.4 Moment-Sum-of-Squares Relaxation

4. Experiments

    4.1 Synthetic Dataset

    4.2 Real Dataset

5. Discussions, Acknowledgements, and References

A. Proofs

B. Solution Extraction in Relaxed Formulation

C. On Moment Sum-of-Squares Relaxation Hierarchy

D. Platt Scaling [31]

E. Detailed Experimental Results

F. Robust Hyperbolic Support Vector Machine

4 Experiments

We validate the performance of the semidefinite relaxation (SDP) and the sparse moment-sum-of-squares relaxation (Moment) by comparing various metrics with those of projected gradient descent (PGD) on a combination of synthetic and real datasets. The PGD implementation adapts the MATLAB code of Cho et al. [4], with a learning rate of 0.001, 2000 epochs for the synthetic datasets and 4000 epochs for the real datasets, warm-started from a Euclidean SVM solution.
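The authors' PGD code is not reproduced here; the following minimal sketch only illustrates the projected-gradient pattern the baseline relies on (gradient step, then projection back onto the feasible set). The objective and the unit-ball constraint are a toy stand-in, not the HSVM problem.

```python
import numpy as np

def pgd(grad, project, x0, lr=1e-3, epochs=2000):
    # Projected gradient descent: take a gradient step, then map the
    # iterate back onto the feasible set with an explicit projection.
    x = project(np.asarray(x0, dtype=float))
    for _ in range(epochs):
        x = project(x - lr * grad(x))
    return x

# Toy instance (hypothetical, not the HSVM objective): minimize
# ||x - a||^2 over the unit ball; the constrained optimum is a/||a||.
a = np.array([3.0, 0.0])
grad = lambda x: 2.0 * (x - a)
project = lambda x: x / max(1.0, float(np.linalg.norm(x)))
x_star = pgd(grad, project, x0=np.zeros(2), lr=0.05, epochs=200)
```

The warm start in the paper corresponds to choosing `x0` as the Euclidean SVM solution rather than zero.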

Datasets. For synthetic datasets, we construct Gaussian and tree embedding datasets following Cho et al. [4], Mishne et al. [6], and Weber et al. [7]. For real datasets, our experiments include two machine learning benchmarks, CIFAR-10 [34] and Fashion-MNIST [35], with hyperbolic embeddings obtained through a standard hyperbolic embedding procedure [1, 3, 5], to assess image classification performance. Additionally, we incorporate three graph embedding datasets—football, karate, and polbooks, obtained from Chien et al. [5]—to evaluate the effectiveness of our methods on graph-structured data. We also explore cell embedding datasets, including the Paul Myeloid Progenitors developmental dataset [36], the Olsson Single-Cell RNA sequencing dataset [37], the Krumsiek Simulated Myeloid Progenitors dataset [38], and the Moignard blood cell developmental trace dataset from single-cell gene expression [39], whose inherent hierarchical geometry is well suited to our methods.

We emphasize that all features lie on the Lorentz manifold; when the dimension is 2, they are visualized on the Poincaré disk via stereographic projection.
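For reference, the standard projection from the Lorentz (hyperboloid) model to the Poincaré ball drops the time-like coordinate and rescales; a minimal numpy sketch (the helper name is ours):

```python
import numpy as np

def lorentz_to_poincare(x):
    # Project a point on the hyperboloid {x0^2 - x1^2 - ... - xd^2 = 1,
    # x0 > 0} onto the Poincare ball via x -> (x1, ..., xd) / (1 + x0).
    x = np.asarray(x, dtype=float)
    return x[1:] / (1.0 + x[0])

# (sqrt(2), 1, 0) lies on the hyperboloid: 2 - 1 - 0 = 1.
p = lorentz_to_poincare([np.sqrt(2.0), 1.0, 0.0])
```

The image always lands strictly inside the unit ball, which is what makes the 2-dimensional visualization possible.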

Evaluation Metrics. The primary metrics for assessing model performance are the average training and testing loss, accuracy, and weighted F1 score under a stratified 5-fold train-test split scheme. Furthermore, to assess the tightness of the relaxations, we examine the relative suboptimality gap, defined as
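A stratified k-fold split keeps each class's proportion roughly equal across folds, which stabilizes the averaged metrics on imbalanced data. A self-contained sketch of the splitting step (the helper is hypothetical, not the paper's code):

```python
import numpy as np

def stratified_kfold_indices(y, k=5, seed=0):
    # Yield (train, test) index arrays in which every class is spread
    # evenly over the k folds, preserving the class proportions.
    rng = np.random.default_rng(seed)
    folds = [[] for _ in range(k)]
    for cls in np.unique(y):
        idx = rng.permutation(np.flatnonzero(y == cls))
        for i, chunk in enumerate(np.array_split(idx, k)):
            folds[i].extend(chunk.tolist())
    for i in range(k):
        test = np.array(folds[i])
        train = np.concatenate([folds[j] for j in range(k) if j != i])
        yield train, test

# 80/20 imbalanced labels: every test fold keeps the 80/20 balance.
y = np.array([0] * 80 + [1] * 20)
for train, test in stratified_kfold_indices(y, k=5):
    assert test.size == 20 and int((y[test] == 1).sum()) == 4
```

The per-fold accuracy and weighted F1 scores are then averaged over the five test folds.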

Implementation Details. We use MOSEK [40] in Python as our optimization solver without any intermediate parser, since interacting with the solver directly saves substantial runtime in parsing the problem. MOSEK uses an interior-point method that updates parameters inside the feasible region without projections. All experiments are run and timed on a machine with 8 Intel Broadwell/Ice Lake CPUs and 40 GB of memory. Results are gathered over multiple random seeds and reported.

We first present the results on synthetic Gaussian and tree embedding datasets in Section 4.1, followed by results on various real datasets in Section 4.2. Code to reproduce all experiments is available on GitHub.[1]

:::info
Authors:

(1) Sheng Yang, John A. Paulson School of Engineering and Applied Sciences, Harvard University, Cambridge, MA ([email protected]);

(2) Peihan Liu, John A. Paulson School of Engineering and Applied Sciences, Harvard University, Cambridge, MA ([email protected]);

(3) Cengiz Pehlevan, John A. Paulson School of Engineering and Applied Sciences, Harvard University, Cambridge, MA, Center for Brain Science, Harvard University, Cambridge, MA, and Kempner Institute for the Study of Natural and Artificial Intelligence, Harvard University, Cambridge, MA ([email protected]).

:::


:::info
This paper is available on arxiv under CC by-SA 4.0 Deed (Attribution-Sharealike 4.0 International) license.

:::

[1] https://github.com/yangshengaa/hsvm-relax
