Why Converting Graphs to Python Code Improves AI Reasoning | HackerNoon

News Room | Published 23 April 2025

Table of Links

Abstract and 1 Introduction

2 COCOGEN: Representing Commonsense structures with code and 2.1 Converting (T,G) into Python code

2.2 Few-shot prompting for generating G

3 Evaluation and 3.1 Experimental setup

3.2 Script generation: PROSCRIPT

3.3 Entity state tracking: PROPARA

3.4 Argument graph generation: EXPLAGRAPHS

4 Analysis

5 Related work

6 Conclusion, Acknowledgments, Limitations, and References

A Few-shot models size estimates

B Dynamic prompt Creation

C Human Evaluation

D Dataset statistics

E Sample outputs

F Prompts

G Designing Python class for a structured task

H Impact of Model size

I Variation in prompts

2 COCOGEN: Representing Commonsense structures with code

We focus on tasks of structured commonsense generation. Each training example for such tasks has the form (T, G), where T is a text input and G is the structure to be generated (typically a graph). The key idea of COCOGEN is to transform an output graph G into a semantically equivalent program Gc written in a general-purpose programming language. In this work, we chose Python due to its popularity in the training data of modern Code-LLMs (Xu et al., 2022), but our approach is agnostic to the programming language. The code-transformed graphs resemble the pre-training data of Code-LLMs in format, and thus serve as training or few-shot examples that are easier to generalize from than the original raw graphs. COCOGEN uses Code-LLMs to generate Gc given T, which we then convert back into the graph G.

We use the task of script generation (PROSCRIPT, Figure 1) as a running example to motivate our method: script generation aims to create a script (G) to achieve a given high-level goal (T).

2.1 Converting (T, G) into Python code

We convert a (T, G) pair into a Python class or function. The general procedure involves adding the input text T at the beginning of the code as a class attribute or descriptive comment and encoding the structure G using standard constructs for representing structure in code (e.g., hashmaps, object attributes) or function calls. The goal is to compose Python code that represents the (T, G) pair while retaining the syntax and conventions of typical Python code; a minimal converter along these lines is sketched below.
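
To make the procedure concrete, here is a minimal, hypothetical sketch of such a converter. The function names and the goal/node-list/edge-list input format are our own assumptions for illustration, not the authors' implementation.

```python
# Hypothetical sketch: serialize a (T, G) pair into Python source text.
# The input format and all names below are illustrative assumptions,
# not the paper's actual code.
from collections import defaultdict


def to_identifier(label: str) -> str:
    """Turn a node label like 'Take out several plates' into a Python name."""
    return label.strip().lower().replace(" ", "_")


def graph_to_python(goal: str, nodes: list[str], edges: list[tuple[str, str]]) -> str:
    """Emit a Tree class whose body lists the nodes and edges of G."""
    children = defaultdict(list)
    for parent, child in edges:
        children[parent].append(child)

    lines = ["class Tree:", f'    goal = "{goal}"', ""]
    # Instantiate every node as a Node() object.
    for node in nodes:
        lines.append(f"    {to_identifier(node)} = Node()")
    lines.append("")
    # Encode edges as 'children' attributes on the parent nodes.
    for parent, kids in children.items():
        kid_names = ", ".join(to_identifier(k) for k in kids)
        lines.append(f"    {to_identifier(parent)}.children = [{kid_names}]")
    return "\n".join(lines)
```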

For example, for the script generation task, we convert the (T, G) pair into a Tree class (Figure 1b). The goal T is added as a class attribute (goal), and the script G is added by listing the nodes and edges separately. We first instantiate the list of nodes as objects of class Node. Then, the edges are added as an attribute children for each node (Figure 1b). For example, we instantiate the node “Take out several plates” as take_out_several_plates = Node(), and add it as a child of the node take_pies_out_to_cool.
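
For readers without the figure at hand, the resulting class might look roughly like the sketch below; the Node helper, the goal string, and the exact set of nodes are reconstructed from the description above rather than copied from Figure 1b.

```python
# Illustrative reconstruction of the Figure 1b style (not the exact figure).

class Node:
    """A single script step; children are attached after instantiation."""
    def __init__(self):
        self.children = []


class Tree:
    # The high-level goal T becomes a class attribute.
    goal = "bake pies for guests"  # placeholder goal, assumed for illustration

    # Nodes of the script G, one Node object per step.
    take_pies_out_to_cool = Node()
    take_out_several_plates = Node()

    # Edges of G, encoded as 'children' attributes on the parent nodes.
    take_pies_out_to_cool.children = [take_out_several_plates]
```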

While there are multiple ways of representing a training example as a Python class, we found empirically that this relatively simple format is the most effective, especially with larger models. We analyze the choice of format and its connection with the model size in Section 4.

2.2 Few-shot prompting for generating G

Figure 2: COCOGEN uses a prompt consisting of k (5-10) Python classes. During inference, the test input is converted to a partial class, as shown above, appended to the prompt, and completed by a code generation model such as CODEX.

In our experiments, we used CODEX (Chen et al., 2021a) and found that it nearly always generates syntactically valid Python. Thus, the generated code can easily be converted back into a graph and evaluated using each dataset's original metrics. Appendix F lists sample prompts for each of the tasks we experimented with.
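
The prompting and decoding loop can be sketched as follows. The prompt assembly, the placeholder complete_with_code_llm call, and the ast-based parser are our own illustrative assumptions rather than the authors' released code, but the parser shows why syntactically valid Python completions are straightforward to map back to graphs.

```python
# Hypothetical sketch of the few-shot loop: k serialized examples plus a
# partial class for the test input form the prompt; the model's completion
# is parsed back into (nodes, edges) with Python's ast module.
import ast


def build_prompt(example_classes: list[str], test_goal: str) -> str:
    """Concatenate k example classes with a partial class for the test input."""
    partial = f'class Tree:\n    goal = "{test_goal}"\n'
    return "\n\n".join(example_classes + [partial])


def parse_completion(source: str) -> tuple[list[str], list[tuple[str, str]]]:
    """Recover node names and parent->child edges from generated class code."""
    nodes, edges = [], []
    for stmt in ast.walk(ast.parse(source)):
        if isinstance(stmt, ast.Assign) and len(stmt.targets) == 1:
            target = stmt.targets[0]
            # 'step_name = Node()' introduces a node.
            if (isinstance(target, ast.Name)
                    and isinstance(stmt.value, ast.Call)
                    and getattr(stmt.value.func, "id", None) == "Node"):
                nodes.append(target.id)
            # 'parent.children = [child, ...]' introduces edges.
            elif (isinstance(target, ast.Attribute)
                    and target.attr == "children"
                    and isinstance(stmt.value, ast.List)):
                for elt in stmt.value.elts:
                    edges.append((target.value.id, elt.id))
    return nodes, edges


# Usage (complete_with_code_llm is a stand-in for whatever Code-LLM API is used):
# prompt = build_prompt(example_classes, "bake a cake")
# completion = complete_with_code_llm(prompt)
# nodes, edges = parse_completion(prompt.split("\n\n")[-1] + completion)
```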


Authors:

(1) Aman Madaan, Language Technologies Institute, Carnegie Mellon University, USA ([email protected]);

(2) Shuyan Zhou, Language Technologies Institute, Carnegie Mellon University, USA ([email protected]);

(3) Uri Alon, Language Technologies Institute, Carnegie Mellon University, USA ([email protected]);

(4) Yiming Yang, Language Technologies Institute, Carnegie Mellon University, USA ([email protected]);

(5) Graham Neubig, Language Technologies Institute, Carnegie Mellon University, USA ([email protected]).
